Jan 31 09:04:48 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 31 09:04:48 crc restorecon[4582]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:04:48 crc restorecon[4582]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 
09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:48
crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 
09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc 
restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:48 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:48 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:04:49 crc restorecon[4582]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc 
restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc 
restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc 
restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc 
restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:04:49 crc restorecon[4582]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 31 09:04:49 crc kubenswrapper[4783]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.521290 4783 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523361 4783 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523378 4783 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523383 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523387 4783 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523391 4783 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523394 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523398 4783 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523402 4783 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523409 4783 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523412 
4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523417 4783 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523421 4783 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523426 4783 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523429 4783 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523433 4783 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523437 4783 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523441 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523445 4783 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523448 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523453 4783 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523456 4783 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523461 4783 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523465 4783 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523468 4783 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523471 4783 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523475 4783 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523478 4783 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523481 4783 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523485 4783 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523488 4783 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523492 4783 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523496 4783 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523499 4783 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523503 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523506 4783 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523509 4783 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523513 4783 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523517 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523520 4783 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523523 4783 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523528 4783 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523532 4783 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523535 4783 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523538 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523541 4783 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523545 4783 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523548 4783 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523552 4783 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523555 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523559 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523562 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523566 4783 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523570 4783 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523573 4783 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523576 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523580 4783 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523583 4783 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523586 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523589 4783 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523593 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523596 4783 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523599 4783 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523602 4783 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523606 4783 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523610 4783 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523614 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523617 4783 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523621 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523624 4783 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523629 4783 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.523634 4783 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524040 4783 flags.go:64] FLAG: --address="0.0.0.0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524051 4783 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524059 4783 flags.go:64] FLAG: --anonymous-auth="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524064 4783 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524069 4783 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524073 4783 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524078 4783 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524086 4783 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524090 4783 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524094 4783 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524098 4783 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524102 4783 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524106 4783 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524109 4783 flags.go:64] FLAG: --cgroup-root=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524113 4783 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524117 4783 flags.go:64] FLAG: --client-ca-file=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524120 4783 flags.go:64] FLAG: --cloud-config=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524124 4783 flags.go:64] FLAG: --cloud-provider=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524127 4783 flags.go:64] FLAG: --cluster-dns="[]"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524132 4783 flags.go:64] FLAG: --cluster-domain=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524136 4783 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524140 4783 flags.go:64] FLAG: --config-dir=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524144 4783 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524148 4783 flags.go:64] FLAG: --container-log-max-files="5"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524153 4783 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524170 4783 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524175 4783 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524179 4783 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524183 4783 flags.go:64] FLAG: --contention-profiling="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524189 4783 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524193 4783 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524196 4783 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524202 4783 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524206 4783 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524210 4783 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524214 4783 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524218 4783 flags.go:64] FLAG: --enable-load-reader="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524221 4783 flags.go:64] FLAG: --enable-server="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524225 4783 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524231 4783 flags.go:64] FLAG: --event-burst="100"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524235 4783 flags.go:64] FLAG: --event-qps="50"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524238 4783 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524242 4783 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524246 4783 flags.go:64] FLAG: --eviction-hard=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524251 4783 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524254 4783 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524258 4783 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524262 4783 flags.go:64] FLAG: --eviction-soft=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524266 4783 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524270 4783 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524273 4783 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524278 4783 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524281 4783 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524285 4783 flags.go:64] FLAG: --fail-swap-on="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524289 4783 flags.go:64] FLAG: --feature-gates=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524293 4783 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524297 4783 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524301 4783 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524305 4783 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524309 4783 flags.go:64] FLAG: --healthz-port="10248"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524313 4783 flags.go:64] FLAG: --help="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524317 4783 flags.go:64] FLAG: --hostname-override=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524321 4783 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524325 4783 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524329 4783 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524333 4783 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524336 4783 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524340 4783 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524349 4783 flags.go:64] FLAG: --image-service-endpoint=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524353 4783 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524357 4783 flags.go:64] FLAG: --kube-api-burst="100"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524361 4783 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524365 4783 flags.go:64] FLAG: --kube-api-qps="50"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524369 4783 flags.go:64] FLAG: --kube-reserved=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524373 4783 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524377 4783 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524380 4783 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524384 4783 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524388 4783 flags.go:64] FLAG: --lock-file=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524391 4783 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524395 4783 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524399 4783 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524404 4783 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524408 4783 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524411 4783 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524415 4783 flags.go:64] FLAG: --logging-format="text"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524419 4783 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524423 4783 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524427 4783 flags.go:64] FLAG: --manifest-url=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524432 4783 flags.go:64] FLAG: --manifest-url-header=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524436 4783 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524440 4783 flags.go:64] FLAG: --max-open-files="1000000"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524445 4783 flags.go:64] FLAG: --max-pods="110"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524449 4783 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524453 4783 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524456 4783 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524460 4783 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524464 4783 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524467 4783 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524471 4783 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524480 4783 flags.go:64] FLAG: --node-status-max-images="50"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524484 4783 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524487 4783 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524491 4783 flags.go:64] FLAG: --pod-cidr=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524495 4783 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524501 4783 flags.go:64] FLAG: --pod-manifest-path=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524505 4783 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524508 4783 flags.go:64] FLAG: --pods-per-core="0"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524512 4783 flags.go:64] FLAG: --port="10250"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524516 4783 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524520 4783 flags.go:64] FLAG: --provider-id=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524523 4783 flags.go:64] FLAG: --qos-reserved=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524527 4783 flags.go:64] FLAG: --read-only-port="10255"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524531 4783 flags.go:64] FLAG: --register-node="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524534 4783 flags.go:64] FLAG: --register-schedulable="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524538 4783 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524544 4783 flags.go:64] FLAG: --registry-burst="10"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524548 4783 flags.go:64] FLAG: --registry-qps="5"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524551 4783 flags.go:64] FLAG: --reserved-cpus=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524555 4783 flags.go:64] FLAG: --reserved-memory=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524559 4783 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524563 4783 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524567 4783 flags.go:64] FLAG: --rotate-certificates="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524571 4783 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524574 4783 flags.go:64] FLAG: --runonce="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524578 4783 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524582 4783 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524586 4783 flags.go:64] FLAG: --seccomp-default="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524590 4783 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524593 4783 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524597 4783 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524601 4783 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524604 4783 flags.go:64] FLAG: --storage-driver-password="root"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524608 4783 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524612 4783 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524615 4783 flags.go:64] FLAG: --storage-driver-user="root"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524619 4783 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524623 4783 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524627 4783 flags.go:64] FLAG: --system-cgroups=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524630 4783 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524636 4783 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524639 4783 flags.go:64] FLAG: --tls-cert-file=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524643 4783 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524647 4783 flags.go:64] FLAG: --tls-min-version=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524651 4783 flags.go:64] FLAG: --tls-private-key-file=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524654 4783 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524657 4783 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524661 4783 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524665 4783 flags.go:64] FLAG: --v="2"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524669 4783 flags.go:64] FLAG: --version="false"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524673 4783 flags.go:64] FLAG: --vmodule=""
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524678 4783 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.524682 4783 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524772 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524777 4783 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524780 4783 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524783 4783 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524787 4783 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524791 4783 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524794 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524797 4783 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524800 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524803 4783 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524806 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524809 4783 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524812 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524815 4783 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524820 4783 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524824 4783 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524828 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524831 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524835 4783 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524838 4783 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524842 4783 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524845 4783 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524849 4783 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524857 4783 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524861 4783 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524864 4783 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524868 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524871 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524874 4783 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524877 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524881 4783 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524884 4783 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524887 4783 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524891 4783 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524894 4783 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524898 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524902 4783 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524911 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524915 4783 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524918 4783 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524922 4783 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524926 4783 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524929 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524932 4783 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524935 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524938 4783 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524941 4783 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524944 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524948 4783 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524951 4783 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524954 4783 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524957 4783 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524960 4783 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524963 4783 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524966 4783 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524969 4783 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524973 4783 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524976 4783 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524979 4783 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524990 4783 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524993 4783 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.524997 4783 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525000 4783 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525003 4783 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525006 4783 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525011 4783 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525025 4783 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525029 4783 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525032 4783 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525038 4783 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.525041 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.525582 4783 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.531429 4783 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.531457 4783 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531523 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531531 4783 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531537 4783 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531542 4783 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531546 4783 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531550 4783 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531553 4783 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531557 4783 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531560 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531563 4783 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531567 4783 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531570 4783 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531573 4783 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531576 4783 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531580 4783 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531583 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531586 4783 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531589 4783
feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531592 4783 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531595 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531598 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531601 4783 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531604 4783 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531608 4783 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531611 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531615 4783 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531619 4783 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531622 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531625 4783 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531629 4783 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531632 4783 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531635 4783 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531638 4783 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531641 4783 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531646 4783 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531649 4783 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531652 4783 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531656 4783 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531659 4783 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531662 4783 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 
09:04:49.531666 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531670 4783 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531674 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531677 4783 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531680 4783 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531683 4783 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531686 4783 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531690 4783 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531693 4783 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531696 4783 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531699 4783 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531703 4783 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531706 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531710 4783 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531713 4783 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531716 4783 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531719 4783 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531722 4783 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531725 4783 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531729 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531732 4783 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531735 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531738 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531742 4783 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531746 4783 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531749 4783 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531753 4783 feature_gate.go:330] unrecognized feature gate: Example Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531757 4783 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531760 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531763 4783 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531767 4783 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.531773 4783 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531888 4783 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531894 4783 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531898 4783 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531902 4783 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531905 4783 feature_gate.go:330] 
unrecognized feature gate: HardwareSpeed Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531909 4783 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531912 4783 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531915 4783 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531919 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531922 4783 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531925 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531928 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531931 4783 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531934 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531937 4783 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531940 4783 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531944 4783 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531947 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531950 4783 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:04:49 crc 
kubenswrapper[4783]: W0131 09:04:49.531953 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531956 4783 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531960 4783 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531963 4783 feature_gate.go:330] unrecognized feature gate: Example Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531966 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531969 4783 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531972 4783 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531975 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531978 4783 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531981 4783 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531985 4783 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531988 4783 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531991 4783 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531994 4783 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.531997 4783 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532001 4783 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532005 4783 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532009 4783 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532026 4783 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532029 4783 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532032 4783 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532035 4783 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532040 4783 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532044 4783 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532048 4783 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532051 4783 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532055 4783 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532058 4783 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532061 4783 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532065 4783 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532068 4783 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532071 4783 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532074 4783 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532077 4783 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532082 4783 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532085 4783 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532089 4783 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532094 4783 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532097 4783 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532101 4783 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532104 4783 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532108 4783 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532111 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532115 4783 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532118 4783 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532122 4783 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532125 4783 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532129 4783 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532132 4783 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532135 4783 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532139 4783 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.532142 4783 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.532148 4783 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.532294 4783 server.go:940] "Client rotation is on, will bootstrap in background" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.534923 4783 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.534997 4783 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.535918 4783 server.go:997] "Starting client certificate rotation" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.535942 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.536151 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-12 15:32:31.826236429 +0000 UTC Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.536264 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.547906 4783 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.549470 4783 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.550356 4783 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.558031 4783 log.go:25] "Validated CRI v1 runtime API" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.574510 4783 log.go:25] "Validated CRI v1 image API" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.575717 4783 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.578269 4783 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-31-09-01-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.578310 4783 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.593407 4783 manager.go:217] Machine: {Timestamp:2026-01-31 09:04:49.591543509 +0000 UTC m=+0.260226977 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:fb2fc674-10e7-4f52-98ab-a2501c80635b BootID:acd87756-2b8a-4238-9ef4-5b9ef00df1bf 
Filesystems:[{Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:39:d6:02 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:39:d6:02 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:04:57:95 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:94:ef:c2 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:88:78:be Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:10:b5:fc Speed:-1 Mtu:1436} {Name:eth10 MacAddress:3a:0b:a4:da:12:50 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:c0:02:63:d2:7d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 
Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 
Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.593573 4783 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.593668 4783 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.593886 4783 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594072 4783 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594101 4783 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],
"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594263 4783 topology_manager.go:138] "Creating topology manager with none policy" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594272 4783 container_manager_linux.go:303] "Creating device plugin manager" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594579 4783 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594603 4783 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.594973 4783 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.595223 4783 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.597423 4783 kubelet.go:418] "Attempting to sync node with API server" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.597456 4783 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.597483 4783 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.597492 4783 kubelet.go:324] "Adding apiserver pod source" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.597504 4783 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131
09:04:49.599524 4783 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.599885 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.599890 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.599933 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.599947 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.600187 4783 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.601727 4783 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602626 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602649 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602657 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602665 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602678 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602684 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602690 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602700 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602709 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602717 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602726 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.602733 4783 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.603190 4783 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.603546 4783 server.go:1280] "Started kubelet" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.603913 4783 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.604037 4783 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.604038 4783 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 31 09:04:49 crc systemd[1]: Started Kubernetes Kubelet. Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.604612 4783 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.605610 4783 server.go:460] "Adding debug handlers to kubelet server" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.605837 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.606083 4783 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.606187 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:41:28.719645163 +0000 UTC Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.606234 4783 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.606773 4783 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 
09:04:49.606840 4783 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.606910 4783 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.606401 4783 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.246:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fc5761970a27a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:04:49.603519098 +0000 UTC m=+0.272202566,LastTimestamp:2026-01-31 09:04:49.603519098 +0000 UTC m=+0.272202566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.607599 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.607661 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="200ms" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.608031 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.612756 4783 factory.go:55] Registering systemd factory Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.612776 4783 factory.go:221] Registration of the systemd container factory successfully Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.613423 4783 factory.go:153] Registering CRI-O factory Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.613446 4783 factory.go:221] Registration of the crio container factory successfully Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.613512 4783 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.613556 4783 factory.go:103] Registering Raw factory Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.613579 4783 manager.go:1196] Started watching for new ooms in manager Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.614362 4783 manager.go:319] Starting recovery of all containers Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615136 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615251 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 31 
09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615338 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615418 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615473 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615529 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615581 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615638 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615698 4783 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615751 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615798 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615851 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615898 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.615970 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616057 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616108 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616179 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616235 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616292 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616394 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616447 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616497 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616553 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616611 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616669 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616720 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616773 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 31 
09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616824 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616872 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.616929 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617005 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617074 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617121 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617215 4783 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617268 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617330 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617381 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617429 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617476 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617523 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617577 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617628 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617678 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617726 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617774 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617831 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617880 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.617940 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618037 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618095 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618153 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618237 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618293 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618343 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618393 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618443 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618497 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618551 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618611 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618665 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618772 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618834 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618902 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.618957 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619059 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619117 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619201 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619273 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619327 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619384 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619448 
4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619510 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619564 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619616 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619672 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619729 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619780 4783 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619838 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619896 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.619958 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620022 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620077 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620135 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620215 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620278 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620357 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620412 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620467 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620539 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620590 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620644 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620694 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620740 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620795 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620862 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620914 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.620984 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621064 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621140 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621220 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621282 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621337 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621389 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621438 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621526 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621584 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621634 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621681 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621730 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621792 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621871 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621944 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.621996 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622068 
4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622125 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622206 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622263 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622331 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622381 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622447 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.622966 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623048 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623101 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623149 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623274 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623376 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623456 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623508 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623557 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623604 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623656 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623704 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623762 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623829 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.623940 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624004 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624145 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624215 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 
09:04:49.624266 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624312 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624357 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624467 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624530 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624600 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624648 4783 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624694 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624745 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624793 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624838 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624920 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.624967 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625024 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625106 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625174 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625228 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625277 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625325 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625453 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625530 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625578 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625624 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625680 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625757 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625816 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625869 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.625942 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.626958 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.626974 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.626984 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.626995 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627005 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627032 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627073 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627139 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627181 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627203 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627212 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627221 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627236 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627244 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627253 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627285 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627293 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627302 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627310 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627319 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627327 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627337 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627346 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627370 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627428 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627438 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627446 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627455 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627463 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627473 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627482 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627502 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627511 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627519 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627530 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627539 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627549 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627558 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.627587 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628432 4783 manager.go:324] Recovery completed Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628631 4783 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628682 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628694 4783 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628703 4783 reconstruct.go:97] "Volume reconstruction finished" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.628710 4783 reconciler.go:26] "Reconciler: start to sync state" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.635952 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.637439 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.637467 4783 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.637476 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.638084 4783 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.638100 4783 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.638114 4783 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.642118 4783 policy_none.go:49] "None policy: Start" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.643247 4783 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.643601 4783 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.643618 4783 state_mem.go:35] "Initializing new in-memory state store" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.644408 4783 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.644449 4783 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.644471 4783 kubelet.go:2335] "Starting kubelet main sync loop" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.644505 4783 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 31 09:04:49 crc kubenswrapper[4783]: W0131 09:04:49.646340 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.646406 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.685853 4783 manager.go:334] "Starting Device Plugin manager" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.685897 4783 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.685909 4783 server.go:79] "Starting device plugin registration server" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.686154 4783 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.686190 4783 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 31 09:04:49 crc 
kubenswrapper[4783]: I0131 09:04:49.686291 4783 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.686399 4783 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.686413 4783 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.692298 4783 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.745156 4783 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.745289 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.745959 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.745995 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.746015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.746181 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.746912 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.746945 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.746955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747047 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747064 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747086 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747248 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747286 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747688 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747719 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747729 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747840 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747897 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747936 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747946 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747971 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.747995 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748048 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748092 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748580 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748596 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748605 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748625 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748643 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748755 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748848 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.748876 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749304 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749312 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749418 
4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749437 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749916 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.749925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.786416 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.786898 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.786927 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.786936 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.786951 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.787366 4783 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.246:6443: connect: connection refused" node="crc" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.808100 4783 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="400ms" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830241 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830294 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830322 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830338 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830377 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830398 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830449 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830464 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830481 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830516 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830530 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830543 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830558 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830669 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.830692 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931711 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931743 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931760 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931777 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931805 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931819 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931831 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931843 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931856 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931857 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931910 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931918 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931935 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931884 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931958 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931977 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931977 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931998 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931961 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932015 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931993 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932020 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932039 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.931961 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932071 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932083 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932089 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932088 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932115 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.932247 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.987744 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.989130 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.989179 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.989191 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:49 crc kubenswrapper[4783]: I0131 09:04:49.989214 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:04:49 crc kubenswrapper[4783]: E0131 09:04:49.989580 4783 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.246:6443: connect: connection refused" 
node="crc" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.073570 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.079205 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.090954 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:50 crc kubenswrapper[4783]: W0131 09:04:50.099732 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-07cb8859bbf8d4d2ed62108a61e07eaad11e124bd79126f00fccd63aa940967d WatchSource:0}: Error finding container 07cb8859bbf8d4d2ed62108a61e07eaad11e124bd79126f00fccd63aa940967d: Status 404 returned error can't find the container with id 07cb8859bbf8d4d2ed62108a61e07eaad11e124bd79126f00fccd63aa940967d Jan 31 09:04:50 crc kubenswrapper[4783]: W0131 09:04:50.101525 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-939c1636cf121188475c7c77cc38f89af0d4959089c0293135884cf648cbe5f2 WatchSource:0}: Error finding container 939c1636cf121188475c7c77cc38f89af0d4959089c0293135884cf648cbe5f2: Status 404 returned error can't find the container with id 939c1636cf121188475c7c77cc38f89af0d4959089c0293135884cf648cbe5f2 Jan 31 09:04:50 crc kubenswrapper[4783]: W0131 09:04:50.105158 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7f174d81b1eea7a8e8c6bf7cf3e2bc708a42e5ad9d65f05a744e5a467ba02d9b WatchSource:0}: Error finding container 
7f174d81b1eea7a8e8c6bf7cf3e2bc708a42e5ad9d65f05a744e5a467ba02d9b: Status 404 returned error can't find the container with id 7f174d81b1eea7a8e8c6bf7cf3e2bc708a42e5ad9d65f05a744e5a467ba02d9b Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.107873 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.111249 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:04:50 crc kubenswrapper[4783]: W0131 09:04:50.120393 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e8f8bea757ccff83634fefbcd30c258bc0bb1f982a7d488a8142310715774a86 WatchSource:0}: Error finding container e8f8bea757ccff83634fefbcd30c258bc0bb1f982a7d488a8142310715774a86: Status 404 returned error can't find the container with id e8f8bea757ccff83634fefbcd30c258bc0bb1f982a7d488a8142310715774a86 Jan 31 09:04:50 crc kubenswrapper[4783]: W0131 09:04:50.124245 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7cc1c7b3f56af59db3549efe86c61c8ba2fd2de6c62d1c0dc9ed56ad9c9f0e25 WatchSource:0}: Error finding container 7cc1c7b3f56af59db3549efe86c61c8ba2fd2de6c62d1c0dc9ed56ad9c9f0e25: Status 404 returned error can't find the container with id 7cc1c7b3f56af59db3549efe86c61c8ba2fd2de6c62d1c0dc9ed56ad9c9f0e25 Jan 31 09:04:50 crc kubenswrapper[4783]: E0131 09:04:50.209002 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="800ms" Jan 
31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.390504 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.391368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.391416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.391426 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.391449 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:04:50 crc kubenswrapper[4783]: E0131 09:04:50.391857 4783 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.246:6443: connect: connection refused" node="crc" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.605313 4783 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.606356 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 09:41:42.123781196 +0000 UTC Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.650498 4783 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa" exitCode=0 Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.650557 4783 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.650637 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"939c1636cf121188475c7c77cc38f89af0d4959089c0293135884cf648cbe5f2"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.650722 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.651808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.651837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.651791 4783 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ef7edee2ca6f502b42b699e110ef484b0f0da86cf3d2015cf8321e2e8864f7fc" exitCode=0 Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.651846 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.651968 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ef7edee2ca6f502b42b699e110ef484b0f0da86cf3d2015cf8321e2e8864f7fc"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.652051 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7cc1c7b3f56af59db3549efe86c61c8ba2fd2de6c62d1c0dc9ed56ad9c9f0e25"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.652157 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.652837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.652908 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.652967 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.653872 4783 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579" exitCode=0 Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.653934 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.653950 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e8f8bea757ccff83634fefbcd30c258bc0bb1f982a7d488a8142310715774a86"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.654008 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.654664 4783 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.654775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.654790 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.655181 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.655203 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f174d81b1eea7a8e8c6bf7cf3e2bc708a42e5ad9d65f05a744e5a467ba02d9b"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.656515 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.656594 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.656442 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef" exitCode=0 Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.656622 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07cb8859bbf8d4d2ed62108a61e07eaad11e124bd79126f00fccd63aa940967d"} Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.657962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.658037 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.658118 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.660580 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.661312 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.661341 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:50 crc kubenswrapper[4783]: I0131 09:04:50.661351 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.009986 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="1.6s" Jan 31 09:04:51 crc kubenswrapper[4783]: W0131 09:04:51.036701 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: 
connection refused Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.036764 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:51 crc kubenswrapper[4783]: W0131 09:04:51.097486 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.097545 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:51 crc kubenswrapper[4783]: W0131 09:04:51.116466 4783 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.116513 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:51 crc kubenswrapper[4783]: W0131 09:04:51.170314 4783 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.246:6443: connect: connection refused Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.170378 4783 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.246:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.192571 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.193508 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.193540 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.193562 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.193584 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:04:51 crc kubenswrapper[4783]: E0131 09:04:51.193887 4783 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.246:6443: connect: connection refused" node="crc" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.607501 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 21:52:02.810367916 +0000 UTC Jan 31 09:04:51 crc 
kubenswrapper[4783]: I0131 09:04:51.657569 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.660590 4783 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9" exitCode=0 Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.660647 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.660746 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.661467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.661496 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.661505 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.662357 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"052e53334214d5a07851eed05c78b6d3530aa8aacea9f074be93c1305a991c3b"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.662404 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.663010 4783 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.663034 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.663042 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.664285 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.664334 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.664345 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.664431 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.665133 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.665155 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.665179 4783 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667208 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667247 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667259 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667222 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.667905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669183 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669213 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669228 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669236 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669244 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357"} Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669261 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669768 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:51 crc kubenswrapper[4783]: I0131 09:04:51.669776 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.607685 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:42:32.695823457 +0000 UTC Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.672982 4783 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926" exitCode=0 Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673066 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926"} Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673112 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673139 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673215 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673681 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.673724 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674017 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674044 4783 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674052 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674099 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674359 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.674422 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.794003 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.794697 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.794732 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.794743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.794771 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:04:52 crc kubenswrapper[4783]: I0131 09:04:52.934103 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.607863 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 09:49:50.777205593 +0000 UTC Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679091 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425"} Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679141 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1"} Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679148 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679153 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d"} Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679271 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5"} Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679302 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25"} Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.679284 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680032 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680068 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:53 crc kubenswrapper[4783]: I0131 09:04:53.680665 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 
09:04:54.118530 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.607945 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:20:20.160669575 +0000 UTC Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.681385 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.682014 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.682038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.682046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.903136 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.903355 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.903399 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.904340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.904368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:54 crc kubenswrapper[4783]: I0131 09:04:54.904376 4783 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.215055 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.215198 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.216035 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.216068 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.216090 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.357069 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.608658 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 14:19:08.054949814 +0000 UTC Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.682959 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.682994 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683031 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683704 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683734 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683950 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:55 crc kubenswrapper[4783]: I0131 09:04:55.683978 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.608743 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:47:05.268203717 +0000 UTC Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.856146 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.856305 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.857180 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.857211 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.857220 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.925809 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.925926 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.926686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.926747 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:56 crc kubenswrapper[4783]: I0131 09:04:56.926762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.250545 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.250675 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.251587 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.251623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.251635 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.254705 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 
09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.609592 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 00:42:25.507028299 +0000 UTC Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.686520 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.687150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.687203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:57 crc kubenswrapper[4783]: I0131 09:04:57.687212 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.395969 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.396099 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.396963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.397013 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.397024 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:04:58 crc kubenswrapper[4783]: I0131 09:04:58.610278 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 
15:32:26.694354486 +0000 UTC Jan 31 09:04:59 crc kubenswrapper[4783]: I0131 09:04:59.610796 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:57:38.799872208 +0000 UTC Jan 31 09:04:59 crc kubenswrapper[4783]: E0131 09:04:59.692419 4783 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.611801 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:57:13.774369256 +0000 UTC Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.780785 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.780970 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.782047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.782082 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.782090 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:00 crc kubenswrapper[4783]: I0131 09:05:00.784058 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.606372 4783 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.612580 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 22:19:17.633327139 +0000 UTC Jan 31 09:05:01 crc kubenswrapper[4783]: E0131 09:05:01.659106 4783 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.692819 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.693574 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.693623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.693634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.790097 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, 
clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.790245 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.795337 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 31 09:05:01 crc kubenswrapper[4783]: I0131 09:05:01.795389 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:05:02 crc kubenswrapper[4783]: I0131 09:05:02.613528 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:30:21.583242868 +0000 UTC Jan 31 09:05:03 crc kubenswrapper[4783]: I0131 09:05:03.613986 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:10:07.837369632 +0000 UTC Jan 31 09:05:03 crc kubenswrapper[4783]: I0131 09:05:03.781289 4783 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:05:03 crc kubenswrapper[4783]: I0131 09:05:03.781333 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.614919 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 18:28:37.415535724 +0000 UTC Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.907444 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.907571 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.908446 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.908474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.908482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:04 crc kubenswrapper[4783]: I0131 09:05:04.910686 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.615318 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 00:58:05.239639581 +0000 UTC Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.699705 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.700470 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.700496 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.700504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.802945 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:05:05 crc kubenswrapper[4783]: I0131 09:05:05.811702 4783 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.616290 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 17:23:34.83866291 +0000 UTC Jan 31 09:05:06 crc kubenswrapper[4783]: E0131 09:05:06.784264 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.786411 4783 trace.go:236] 
Trace[1170038388]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:04:53.717) (total time: 13068ms): Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1170038388]: ---"Objects listed" error: 13068ms (09:05:06.786) Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1170038388]: [13.068900216s] [13.068900216s] END Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.786444 4783 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.786479 4783 trace.go:236] Trace[856538592]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:04:53.899) (total time: 12887ms): Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[856538592]: ---"Objects listed" error: 12887ms (09:05:06.786) Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[856538592]: [12.887081599s] [12.887081599s] END Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.786499 4783 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:06 crc kubenswrapper[4783]: E0131 09:05:06.786677 4783 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.787438 4783 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.788708 4783 trace.go:236] Trace[1406328560]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:04:54.282) (total time: 12505ms): Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1406328560]: ---"Objects listed" error: 12505ms (09:05:06.788) Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1406328560]: [12.505891617s] [12.505891617s] END Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.788727 4783 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.789788 4783 trace.go:236] Trace[1522066361]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:04:54.128) (total time: 12661ms): Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1522066361]: ---"Objects listed" error: 12660ms (09:05:06.789) Jan 31 09:05:06 crc kubenswrapper[4783]: Trace[1522066361]: [12.661101529s] [12.661101529s] END Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.789807 4783 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854562 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56092->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854606 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56092->192.168.126.11:17697: read: connection reset by peer" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854571 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56102->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854648 4783 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56102->192.168.126.11:17697: read: connection reset by peer" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854893 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.854929 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.926221 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 09:05:06 crc kubenswrapper[4783]: I0131 09:05:06.926279 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.607240 4783 apiserver.go:52] "Watching apiserver" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.609592 4783 
reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.609745 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610045 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610068 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.610120 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610235 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.610310 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610342 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610349 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.610352 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.610594 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.611863 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.612499 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.612601 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.613999 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.615221 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.615388 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.615496 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.615588 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.615597 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.616434 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2026-01-18 13:39:20.926908982 +0000 UTC Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.631387 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.638944 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.645359 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.651494 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.657124 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.663312 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.669711 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.675798 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.704992 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.706303 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b" exitCode=255 Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.706369 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b"} Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.707949 4783 desired_state_of_world_populator.go:154] "Finished populating initial desired state of 
world" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.712931 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.712996 4783 scope.go:117] "RemoveContainer" containerID="4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.714781 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.722266 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.730841 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.737286 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.744051 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.749936 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793786 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793826 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793846 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") 
pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793862 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793877 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793892 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793910 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793927 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.793941 4783 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794145 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794562 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794580 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794626 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794463 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794637 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794664 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794719 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794797 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795379 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795407 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795424 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795441 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795457 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795473 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795492 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795516 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795533 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795547 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.794880 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795175 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795195 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795237 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795322 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795450 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795970 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796214 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796228 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796236 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796246 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796260 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796310 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796403 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796328 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796472 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796628 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.795563 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796692 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796715 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796732 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796749 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796766 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796780 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796794 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796809 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796825 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796841 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796857 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796871 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796888 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796908 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796923 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796938 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796952 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796969 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796983 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.796999 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:05:07 crc 
kubenswrapper[4783]: I0131 09:05:07.797012 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797027 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797044 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797059 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797072 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797086 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797101 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797114 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797129 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797143 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798016 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798036 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798061 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798077 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798094 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798110 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798125 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798140 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798154 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798184 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798199 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798214 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 
09:05:07.798228 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798245 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798557 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798587 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798605 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798620 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798749 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798775 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798791 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798813 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798830 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 
09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798845 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798862 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798878 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798894 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798909 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798924 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798944 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798959 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798973 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798989 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799003 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797502 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797574 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797764 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797954 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.797989 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798005 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798050 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798132 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798334 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798409 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798580 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798655 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798672 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798815 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.798937 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799038 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799078 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799082 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799093 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799093 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799329 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799351 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799366 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799384 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799384 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799408 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799425 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799441 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799462 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799479 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799497 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799521 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799539 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799555 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799570 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799585 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799600 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799615 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799686 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799706 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799899 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799919 4783 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799937 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799953 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800556 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800586 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800606 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800624 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800643 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800659 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800676 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800692 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800706 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800720 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800736 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800750 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800766 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800780 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") 
" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800815 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800829 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800844 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800863 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800877 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800892 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800907 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800922 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800938 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800953 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800967 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 
09:05:07.800983 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800998 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801014 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801030 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801044 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801084 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801103 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801120 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801158 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801189 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801207 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801225 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801240 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801256 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801272 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801331 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801358 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799386 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801690 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799423 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799409 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799460 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799479 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799710 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799714 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799722 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799778 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.799791 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800084 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800096 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800122 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800136 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800156 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800178 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800185 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800423 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800441 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800460 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800552 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800613 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800627 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800836 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800871 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800894 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.800959 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801188 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801298 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801307 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801341 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801346 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801367 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801400 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801587 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801665 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.801754 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802057 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802078 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802367 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802387 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802409 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802426 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802671 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.802940 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803002 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803022 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803241 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803304 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803382 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803478 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803666 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803706 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803711 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.803992 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804005 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804078 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804084 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804095 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804718 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804733 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804858 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804853 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804953 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805079 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805097 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805113 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805128 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805143 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805171 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805186 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805202 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805218 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805234 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc 
kubenswrapper[4783]: I0131 09:05:07.805248 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805262 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805277 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805291 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805305 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805319 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805334 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805352 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805366 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805380 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805395 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:05:07 
crc kubenswrapper[4783]: I0131 09:05:07.805410 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805424 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805437 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805451 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805466 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805482 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805496 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805518 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805533 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805549 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805566 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") 
" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805579 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805593 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805606 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805620 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805634 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805650 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805666 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805681 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805697 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805713 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805746 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805769 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805786 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805802 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805817 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805832 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805848 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805863 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805878 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805895 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805909 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805925 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805940 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805984 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806033 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806043 4783 reconciler_common.go:293] "Volume 
detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806052 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806061 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806069 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806078 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806086 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806095 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806104 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806112 4783 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806120 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806128 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806136 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806144 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806152 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806196 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806204 4783 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806214 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806223 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806234 4783 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806242 4783 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806250 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806258 4783 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" 
DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806267 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806274 4783 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806283 4783 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806291 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806298 4783 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806306 4783 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806314 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806322 
4783 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806330 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806338 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806346 4783 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806353 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806362 4783 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806369 4783 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806379 4783 reconciler_common.go:293] "Volume detached for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806386 4783 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806394 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806402 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806410 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806417 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806426 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806434 4783 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806441 4783 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806449 4783 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806457 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806464 4783 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806472 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806479 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806489 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 
crc kubenswrapper[4783]: I0131 09:05:07.806497 4783 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806505 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806520 4783 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806528 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806535 4783 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806543 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806551 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806559 4783 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806568 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806576 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806584 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806591 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806600 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806607 4783 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806615 4783 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806624 4783 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806632 4783 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806640 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806649 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806656 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806664 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806672 4783 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc 
kubenswrapper[4783]: I0131 09:05:07.806680 4783 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806688 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806696 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806704 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806712 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806721 4783 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806729 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806737 4783 reconciler_common.go:293] "Volume detached for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806745 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806755 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806763 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806772 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806779 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806787 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806795 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806803 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806812 4783 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806820 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806828 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806835 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806843 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806850 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806859 4783 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806868 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806876 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806884 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806891 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806898 4783 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806906 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806933 4783 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806941 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806949 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806956 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806964 4783 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807701 4783 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.804957 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805086 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805254 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805519 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805651 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809802 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805869 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805980 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.805999 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806258 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806303 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806544 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806693 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806850 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.806897 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807119 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807258 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807391 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807483 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807584 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.807695 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808011 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808181 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808315 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808316 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809933 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808467 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808595 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808645 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808775 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808815 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.808899 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809044 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809051 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809378 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809387 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809547 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809560 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.809696 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810120 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.810246 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:05:08.310212386 +0000 UTC m=+18.978895854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810349 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810557 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810618 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810645 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.810989 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.811520 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.811761 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.811934 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814087 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814290 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.814543 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.815962 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.816157 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:08.316145689 +0000 UTC m=+18.984829157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.816183 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814675 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814710 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.815568 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814221 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.816442 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.816610 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.817038 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.817094 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:08.317078888 +0000 UTC m=+18.985762356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.817038 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.814628 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.817448 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.817582 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.817776 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.817948 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.818186 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.818706 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819189 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819222 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819543 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819573 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819815 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.819948 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.820256 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.820281 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.820300 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.820315 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.820325 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.820366 4783 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:08.320356755 +0000 UTC m=+18.989040224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.820860 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.821419 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.821496 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.821900 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.822401 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.822722 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.823795 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.823951 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.824103 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.824728 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.825548 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.825611 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.827443 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.827536 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.827603 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:07 crc kubenswrapper[4783]: E0131 09:05:07.827685 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:08.32767551 +0000 UTC m=+18.996358978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.831179 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.832261 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.832535 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.833364 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.833948 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.842875 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.846829 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.851883 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907587 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907619 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907663 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907673 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907682 4783 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907691 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907707 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907715 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907724 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907732 4783 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907740 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907744 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907748 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907805 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907818 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907829 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907850 4783 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907861 4783 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907871 4783 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907880 4783 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907889 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907899 4783 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907910 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907919 4783 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907928 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907937 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 
09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907945 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907959 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907968 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907978 4783 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907987 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.907997 4783 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908006 4783 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908016 4783 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908027 4783 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908037 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908047 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908056 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908065 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908074 4783 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908083 4783 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908090 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908097 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908108 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908115 4783 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908125 4783 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908132 4783 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908139 4783 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908146 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908153 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908181 4783 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908190 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908197 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908204 4783 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908212 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908218 4783 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908226 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908234 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908241 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908248 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908255 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908262 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908270 4783 reconciler_common.go:293] "Volume detached for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908282 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908284 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908289 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908353 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908363 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908372 4783 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908380 4783 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908389 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908397 4783 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908429 4783 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908442 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908450 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908458 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908467 4783 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908475 4783 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908483 4783 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908507 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908524 4783 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908531 4783 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908543 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908550 4783 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908558 4783 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908565 4783 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908591 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908599 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908611 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.908624 4783 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.923774 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.929742 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:05:07 crc kubenswrapper[4783]: I0131 09:05:07.934477 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:05:07 crc kubenswrapper[4783]: W0131 09:05:07.946456 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ec440ea575749f421c5d84524fbb0789ee7127317f0902ca272a0d450477a867 WatchSource:0}: Error finding container ec440ea575749f421c5d84524fbb0789ee7127317f0902ca272a0d450477a867: Status 404 returned error can't find the container with id ec440ea575749f421c5d84524fbb0789ee7127317f0902ca272a0d450477a867 Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.310814 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.310966 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:05:09.310926781 +0000 UTC m=+19.979610249 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.411514 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.411549 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.411574 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.411589 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411626 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411678 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:09.411664569 +0000 UTC m=+20.080348037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411683 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411698 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411708 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411743 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:09.41173379 +0000 UTC m=+20.080417257 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411784 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411804 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:09.411798341 +0000 UTC m=+20.080481810 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411841 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411851 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411857 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:08 crc kubenswrapper[4783]: E0131 09:05:08.411875 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:09.411868824 +0000 UTC m=+20.080552292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.414718 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.424243 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.424709 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.426836 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized 
nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.436223 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.444041 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.452677 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.461498 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.469339 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.477335 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.486186 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.493454 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.501140 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.513228 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.526690 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.541055 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.550867 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.561103 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.616684 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:10:11.964423498 +0000 UTC Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.708648 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ec440ea575749f421c5d84524fbb0789ee7127317f0902ca272a0d450477a867"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.709639 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.709676 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.709688 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ada3fb67dbfe6b8bad7bbe0953b0da4fa47a29358777f99818cf414a4080934b"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.710443 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.710483 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1d3beae4d4be134790e7964e780513e94c53c773e54efb4ca5d520d7c3758d92"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.713231 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.715558 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f"} Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.715748 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.719328 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.733763 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.742532 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.751210 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.759496 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.768590 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.777026 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.787094 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.796147 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.803578 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.811871 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request 
header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.822618 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.831331 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.844211 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.852204 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:08 crc kubenswrapper[4783]: I0131 09:05:08.864666 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:08Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.317597 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.317789 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:05:11.31776708 +0000 UTC m=+21.986450549 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.418055 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.418103 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.418123 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.418147 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418233 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418258 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418273 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418296 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418310 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418278 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:11.418266401 +0000 UTC m=+22.086949868 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418364 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:11.418347702 +0000 UTC m=+22.087031181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418380 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:11.418373141 +0000 UTC m=+22.087056619 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418466 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418521 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418540 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.418639 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:11.418612112 +0000 UTC m=+22.087295580 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.617059 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 06:28:23.184390942 +0000 UTC Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.645505 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.645652 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.645749 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.645669 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.645879 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:09 crc kubenswrapper[4783]: E0131 09:05:09.645952 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.652987 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.653474 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.654554 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.655085 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.655985 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.656467 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.656991 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.657590 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.657859 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.658413 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.659244 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.659688 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.660604 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.661046 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.661508 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.662303 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.662757 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.663584 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.663915 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.664399 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.665060 4783 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.665291 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.665716 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.666559 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.666940 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.667835 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.668238 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.668789 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.669828 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.670254 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.671051 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.671472 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.672217 4783 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath 
from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.672314 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.672670 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.673739 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.674541 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.674925 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.676206 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.676760 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.677601 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.678130 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.679010 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.679424 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.680273 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.680806 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.681057 4783 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.681656 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.682058 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.682843 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.683325 4783 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.684255 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.684673 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.685412 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.685825 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.686607 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.687497 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.688047 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.694031 4783 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff20
42f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.702632 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.710200 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.718467 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388"} Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.719110 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.727574 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.735548 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.743245 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.756061 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.763924 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.771827 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.779428 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.788327 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request 
header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.987046 4783 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.988889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.988923 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.989279 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.989571 4783 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.995871 4783 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.996264 4783 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.997632 4783 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.997664 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.997673 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.997691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:09 crc kubenswrapper[4783]: I0131 09:05:09.997700 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:09Z","lastTransitionTime":"2026-01-31T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.011691 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.014457 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.014483 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.014518 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.014530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.014537 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.023911 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.026491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.026533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.026544 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.026572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.026582 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.035626 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.037960 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.037994 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.038005 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.038017 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.038027 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.048196 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.051389 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.051415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.051424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.051434 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.051443 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.059732 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: E0131 09:05:10.059835 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.060826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.060849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.060860 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.060871 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.060879 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.162405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.162435 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.162444 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.162453 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.162462 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.264111 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.264141 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.264150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.264177 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.264186 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.365722 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.365753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.365761 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.365771 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.365779 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.467521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.467555 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.467563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.467575 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.467584 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.570499 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.570550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.570561 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.570582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.570594 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.617838 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 09:37:10.702796332 +0000 UTC Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.673052 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.673091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.673100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.673115 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.673125 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.774537 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.774581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.774591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.774605 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.774614 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.784430 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.786998 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.790110 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.796234 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.813362 4783 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff20
42f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.821952 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.830466 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.838369 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.847475 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.855248 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.863300 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.873374 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request 
header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.875956 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.876002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.876011 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.876026 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.876035 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.882179 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef
8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.891543 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.900684 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.916493 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.926034 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.934857 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.942550 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.950788 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:10Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.977918 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.977951 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.977961 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.977973 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:10 crc kubenswrapper[4783]: I0131 09:05:10.977985 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:10Z","lastTransitionTime":"2026-01-31T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.080139 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.080196 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.080213 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.080227 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.080236 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.182076 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.182103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.182113 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.182122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.182131 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.283891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.283919 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.283930 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.283939 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.283947 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.332477 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.332650 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:05:15.332626773 +0000 UTC m=+26.001310251 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.385501 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.385541 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.385550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.385561 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.385569 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.433010 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.433039 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.433061 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.433079 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433144 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433192 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433211 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433220 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433243 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:15.433224888 +0000 UTC m=+26.101908376 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433258 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433270 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:15.433257099 +0000 UTC m=+26.101940587 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433183 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433286 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433303 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433328 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:15.433320308 +0000 UTC m=+26.102003796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.433354 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:15.433336849 +0000 UTC m=+26.102020327 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.487962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.487992 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.488003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.488015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.488024 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.589725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.589756 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.589764 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.589785 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.589793 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.618155 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:43:06.995184934 +0000 UTC Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.644983 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.645050 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.645076 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.645088 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.645178 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:11 crc kubenswrapper[4783]: E0131 09:05:11.645263 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.691353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.691383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.691391 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.691401 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.691408 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.793119 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.793151 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.793176 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.793189 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.793196 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.895857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.895898 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.895909 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.895922 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.895933 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.997740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.997788 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.997796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.997813 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:11 crc kubenswrapper[4783]: I0131 09:05:11.997855 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:11Z","lastTransitionTime":"2026-01-31T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.099831 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.099870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.099879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.099894 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.099903 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.202348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.202394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.202405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.202423 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.202432 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.235315 4783 csr.go:261] certificate signing request csr-9rrks is approved, waiting to be issued Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.251142 4783 csr.go:257] certificate signing request csr-9rrks is issued Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.304566 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.304604 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.304613 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.304627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.304636 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.406441 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.406501 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.406511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.406563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.406574 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.508240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.508290 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.508302 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.508319 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.508329 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.610342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.610385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.610394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.610411 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.610423 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.618717 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 20:53:09.657303187 +0000 UTC Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.712093 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.712135 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.712145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.712177 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.712190 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.727907 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-99m9k"] Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.728216 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6h2bb"] Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.728374 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.728853 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8q8td"] Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.728962 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.729065 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.729722 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bqnx9"] Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.729900 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730005 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730241 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730257 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730275 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730726 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730794 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730850 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.730891 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.731020 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 09:05:12 crc kubenswrapper[4783]: W0131 09:05:12.732372 4783 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 31 09:05:12 crc kubenswrapper[4783]: E0131 09:05:12.732401 4783 
reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.732730 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.732743 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.732927 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.732947 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.734221 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742493 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742535 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742555 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742570 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-k8s-cni-cncf-io\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742584 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllrp\" (UniqueName: \"kubernetes.io/projected/04e04066-c510-4203-90b8-3296993cb94f-kube-api-access-hllrp\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742599 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-hostroot\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742625 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742650 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-cni-binary-copy\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742665 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-os-release\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742681 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-multus-certs\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742795 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-netns\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742819 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-bin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742838 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffjr\" (UniqueName: \"kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742869 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-cnibin\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742884 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-system-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742897 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-conf-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 
09:05:12.742912 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds7kn\" (UniqueName: \"kubernetes.io/projected/d46d748b-9274-46b0-9954-e55aaec61853-kube-api-access-ds7kn\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742928 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-rootfs\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742944 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742974 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-multus\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.742994 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-kubelet\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " 
pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743050 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-daemon-config\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743083 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-proxy-tls\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743115 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-cnibin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743135 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xlq\" (UniqueName: \"kubernetes.io/projected/0b5ffe9c-191a-4902-8e13-6a869f158784-kube-api-access-57xlq\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743181 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-etc-kubernetes\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 
31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743197 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-system-cni-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743209 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-os-release\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743221 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-socket-dir-parent\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.743233 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d46d748b-9274-46b0-9954-e55aaec61853-hosts-file\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.744480 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.754589 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.764958 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.775094 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.784103 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.798830 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request 
header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.809882 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.814390 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.814423 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.814433 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.814447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.814455 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.818987 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.832945 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844121 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-daemon-config\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844153 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-proxy-tls\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844191 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-57xlq\" (UniqueName: \"kubernetes.io/projected/0b5ffe9c-191a-4902-8e13-6a869f158784-kube-api-access-57xlq\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844238 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-cnibin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844256 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-system-cni-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844446 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-cnibin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844545 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-etc-kubernetes\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844489 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-etc-kubernetes\") pod \"multus-8q8td\" (UID: 
\"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844603 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-system-cni-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844669 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-os-release\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844623 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-os-release\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844727 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-socket-dir-parent\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844751 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d46d748b-9274-46b0-9954-e55aaec61853-hosts-file\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844779 
4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844797 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844813 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844844 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hllrp\" (UniqueName: \"kubernetes.io/projected/04e04066-c510-4203-90b8-3296993cb94f-kube-api-access-hllrp\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844864 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-k8s-cni-cncf-io\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: 
I0131 09:05:12.844883 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-hostroot\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844898 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844913 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-os-release\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844931 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-cni-binary-copy\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844950 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-multus-certs\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844968 4783 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8ffjr\" (UniqueName: \"kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844985 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-netns\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844992 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-daemon-config\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845027 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-bin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845029 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-socket-dir-parent\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.844996 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845002 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-bin\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845098 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-cnibin\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845115 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-system-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845130 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-kubelet\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845144 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-conf-dir\") pod \"multus-8q8td\" (UID: 
\"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845175 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds7kn\" (UniqueName: \"kubernetes.io/projected/d46d748b-9274-46b0-9954-e55aaec61853-kube-api-access-ds7kn\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845190 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-rootfs\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845213 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845231 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-multus\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845278 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-cni-multus\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " 
pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845066 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d46d748b-9274-46b0-9954-e55aaec61853-hosts-file\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845308 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-cnibin\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845334 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-system-cni-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845354 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-var-lib-kubelet\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845372 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-multus-conf-dir\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845559 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-rootfs\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845568 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845671 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-k8s-cni-cncf-io\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845736 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845760 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-os-release\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845716 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-multus-certs\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845793 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-host-run-netns\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845797 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b5ffe9c-191a-4902-8e13-6a869f158784-hostroot\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.845950 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04e04066-c510-4203-90b8-3296993cb94f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.846007 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04e04066-c510-4203-90b8-3296993cb94f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.846484 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0b5ffe9c-191a-4902-8e13-6a869f158784-cni-binary-copy\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.848263 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-proxy-tls\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.856949 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.864233 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xlq\" (UniqueName: \"kubernetes.io/projected/0b5ffe9c-191a-4902-8e13-6a869f158784-kube-api-access-57xlq\") pod \"multus-8q8td\" (UID: \"0b5ffe9c-191a-4902-8e13-6a869f158784\") " pod="openshift-multus/multus-8q8td" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.864490 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds7kn\" (UniqueName: 
\"kubernetes.io/projected/d46d748b-9274-46b0-9954-e55aaec61853-kube-api-access-ds7kn\") pod \"node-resolver-99m9k\" (UID: \"d46d748b-9274-46b0-9954-e55aaec61853\") " pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.871182 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllrp\" (UniqueName: \"kubernetes.io/projected/04e04066-c510-4203-90b8-3296993cb94f-kube-api-access-hllrp\") pod \"multus-additional-cni-plugins-6h2bb\" (UID: \"04e04066-c510-4203-90b8-3296993cb94f\") " pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.872615 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.881429 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.889656 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.898207 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.907111 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.916225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.916256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.916265 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.916278 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.916288 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:12Z","lastTransitionTime":"2026-01-31T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.921080 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:
51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.928823 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.938308 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.946845 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.954607 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.962624 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.969362 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:12 crc kubenswrapper[4783]: I0131 09:05:12.976524 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.018769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.018799 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.018808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.018824 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.018833 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.041650 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-99m9k" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.046986 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.055636 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8q8td" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.081098 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vr882"] Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.081812 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.084217 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.084328 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.084469 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.086082 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.086363 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.086464 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.086496 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.096611 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.105384 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.113249 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121104 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121130 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121150 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121173 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.121601 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.134410 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.142779 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147441 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147469 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147484 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147500 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147525 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147539 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147554 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147568 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147582 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147594 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147619 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147632 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 
09:05:13.147645 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147658 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147673 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147702 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147718 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc7s\" (UniqueName: \"kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: 
I0131 09:05:13.147738 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147759 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.147773 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.153424 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.166741 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.177638 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.188034 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.201474 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.211248 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.221020 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.223147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.223197 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.223207 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.223222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.223232 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.230090 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248869 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248926 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248948 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248974 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248983 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.248946 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249003 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249023 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249026 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249039 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249051 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249056 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249061 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249071 4783 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249180 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249207 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249223 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249238 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249257 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249279 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249316 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249349 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249295 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249372 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249442 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc7s\" (UniqueName: \"kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249483 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249491 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249506 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249603 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 
09:05:13.249633 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249656 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249685 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249681 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249740 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249806 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249815 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.249880 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.250158 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.251943 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-31 09:00:12 +0000 UTC, rotation deadline is 2026-11-16 20:25:58.541024451 +0000 UTC Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.252009 4783 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6947h20m45.289021411s for next certificate rotation Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.253904 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.263328 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc7s\" (UniqueName: \"kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s\") pod \"ovnkube-node-vr882\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.325301 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.325338 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.325347 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.325364 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.325373 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.410086 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.427942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.427975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.427987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.428002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.428012 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: W0131 09:05:13.447466 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3d03a1_7611_470d_a402_4f40ce95a54f.slice/crio-cc659e66822c3dc572ee10a57ba9f66056fddad388d51623c912080de0da03aa WatchSource:0}: Error finding container cc659e66822c3dc572ee10a57ba9f66056fddad388d51623c912080de0da03aa: Status 404 returned error can't find the container with id cc659e66822c3dc572ee10a57ba9f66056fddad388d51623c912080de0da03aa Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.529384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.529425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.529436 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.529452 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.529465 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.619038 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:51:08.208796243 +0000 UTC Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.631622 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.631655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.631666 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.631680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.631688 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.645113 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.645210 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.645236 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.645306 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.645370 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.645467 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.727791 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" exitCode=0 Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.727867 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.728009 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"cc659e66822c3dc572ee10a57ba9f66056fddad388d51623c912080de0da03aa"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.729279 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerStarted","Data":"11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.729310 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerStarted","Data":"d4f7ec2a586a17bee480770900bbedf80a45638fac75aacb060fbee4c1116bdf"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.731616 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c" exitCode=0 Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.731719 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.731808 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerStarted","Data":"ec7ea49136b9790d0367f5f0f75fbc9f595c27e7d3e4af95519710c0e7ecb056"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733062 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733090 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733099 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733111 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-99m9k" event={"ID":"d46d748b-9274-46b0-9954-e55aaec61853","Type":"ContainerStarted","Data":"4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733122 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.733138 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-99m9k" event={"ID":"d46d748b-9274-46b0-9954-e55aaec61853","Type":"ContainerStarted","Data":"9bc71ea17bb70ce2dc737a671e75bf605a08fa7e6bed3089553d974272527d03"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.741740 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.752951 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.764271 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.775047 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.784342 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.799038 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.808129 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.818369 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.827860 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.835151 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.835225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.835238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.835255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.835268 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.836095 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.844977 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.852878 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.860780 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.861044 4783 projected.go:288] Couldn't get configMap openshift-machine-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.861109 4783 projected.go:194] Error preparing data for projected volume kube-api-access-8ffjr for pod openshift-machine-config-operator/machine-config-daemon-bqnx9: failed to sync configmap cache: timed out waiting for the condition Jan 31 09:05:13 crc kubenswrapper[4783]: E0131 09:05:13.861176 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr podName:fb43cc7e-a0e2-4518-b732-3410c4d4cb5b nodeName:}" failed. No retries permitted until 2026-01-31 09:05:14.361145318 +0000 UTC m=+25.029828785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8ffjr" (UniqueName: "kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr") pod "machine-config-daemon-bqnx9" (UID: "fb43cc7e-a0e2-4518-b732-3410c4d4cb5b") : failed to sync configmap cache: timed out waiting for the condition Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.874496 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.890234 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.904084 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.918116 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.919510 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.926763 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.934977 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.937699 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.937734 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.937745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.937762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.937773 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:13Z","lastTransitionTime":"2026-01-31T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.944234 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.951463 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.959245 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.973143 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.983105 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:13 crc kubenswrapper[4783]: I0131 09:05:13.992089 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:13Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.003456 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.015811 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.030122 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.040290 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.040317 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.040327 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.040342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.040352 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.142262 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.142678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.142693 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.142717 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.142732 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.245900 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.245946 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.245958 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.245975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.245988 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.348024 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.348070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.348082 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.348100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.348112 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.449803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.449845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.449856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.449873 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.449887 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.460159 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffjr\" (UniqueName: \"kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.463508 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffjr\" (UniqueName: \"kubernetes.io/projected/fb43cc7e-a0e2-4518-b732-3410c4d4cb5b-kube-api-access-8ffjr\") pod \"machine-config-daemon-bqnx9\" (UID: \"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\") " pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.551312 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.551343 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.551352 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.551369 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.551378 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.560569 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:05:14 crc kubenswrapper[4783]: W0131 09:05:14.570560 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb43cc7e_a0e2_4518_b732_3410c4d4cb5b.slice/crio-ca09dd6c1eb270a3c1003443f2eff5868af484b1891995364f0eff284937b47f WatchSource:0}: Error finding container ca09dd6c1eb270a3c1003443f2eff5868af484b1891995364f0eff284937b47f: Status 404 returned error can't find the container with id ca09dd6c1eb270a3c1003443f2eff5868af484b1891995364f0eff284937b47f Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.619479 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:24:47.401738599 +0000 UTC Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.653375 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.653405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.653415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.653427 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.653437 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.738971 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.739034 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.739050 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.739062 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.739075 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 
09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.739087 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.740469 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae" exitCode=0 Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.740545 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.742224 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.742257 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.742273 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"ca09dd6c1eb270a3c1003443f2eff5868af484b1891995364f0eff284937b47f"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.750568 4783 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.755675 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.755714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.755725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.755742 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.755755 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.758824 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.766592 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.774639 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.787987 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.796613 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.805793 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.813596 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.822594 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.831053 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.844261 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.852816 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.857605 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.857638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.857648 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc 
kubenswrapper[4783]: I0131 09:05:14.857663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.857673 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.862327 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.871980 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.885455 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.896655 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.910001 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.918250 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.927697 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.935989 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.943347 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.952610 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959842 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959982 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.959994 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:14Z","lastTransitionTime":"2026-01-31T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.967006 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.979589 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.987574 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:14 crc kubenswrapper[4783]: I0131 09:05:14.997140 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:14Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.015937 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.061933 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.061964 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.061974 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.061988 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.061997 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.164474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.164727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.164740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.164755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.164765 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.268025 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.268066 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.268077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.268096 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.268108 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.368009 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.368246 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:05:23.368223454 +0000 UTC m=+34.036906922 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.369884 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.369924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.369936 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.369955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.369967 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.468713 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.468758 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.468812 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.468837 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469025 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469043 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469055 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469104 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:23.469089856 +0000 UTC m=+34.137773323 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469380 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469419 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469433 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469469 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469485 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469469 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:23.469450926 +0000 UTC m=+34.138134394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469578 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:23.4695589 +0000 UTC m=+34.138242378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.469591 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:23.469584828 +0000 UTC m=+34.138268306 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.471974 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.472000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.472012 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.472025 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.472036 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.575199 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.575231 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.575240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.575254 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.575265 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.619855 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:07:27.884584622 +0000 UTC Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.619909 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mwdww"] Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.620503 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.622249 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.622271 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.622663 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.622763 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.637390 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}
,{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.645332 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.645337 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.645436 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.645336 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.645539 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:15 crc kubenswrapper[4783]: E0131 09:05:15.645629 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.647305 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.658600 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.667859 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.670368 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-host\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.670432 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhcln\" (UniqueName: \"kubernetes.io/projected/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-kube-api-access-dhcln\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.670457 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-serviceca\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.676896 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.677340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.677389 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.677401 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.677416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.677427 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.685918 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.693957 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.703916 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.718641 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.726709 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.736332 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.747205 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.748325 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f" exitCode=0 Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.748395 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.760354 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 
09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.771095 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-host\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.771182 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhcln\" (UniqueName: \"kubernetes.io/projected/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-kube-api-access-dhcln\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.771203 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-serviceca\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.771883 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-host\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.772530 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-serviceca\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.772795 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.780449 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.780555 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.780613 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.780683 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.780762 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.783178 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.789258 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhcln\" (UniqueName: \"kubernetes.io/projected/9fc9cc5f-1f3b-46b6-bf0c-b558160f9299-kube-api-access-dhcln\") pod \"node-ca-mwdww\" (UID: \"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\") " pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.793209 4783 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has 
all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.802561 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.843502 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.877468 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.883651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.883685 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.883697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc 
kubenswrapper[4783]: I0131 09:05:15.883713 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.883724 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.917078 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0
321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.929540 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mwdww" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.957079 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:15 crc kubenswrapper[4783]: W0131 09:05:15.980808 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc9cc5f_1f3b_46b6_bf0c_b558160f9299.slice/crio-255d500f7eb4647b4711e2580c60040dbbf560e417a87af23f6e0bafe4af4bfb WatchSource:0}: Error finding container 255d500f7eb4647b4711e2580c60040dbbf560e417a87af23f6e0bafe4af4bfb: Status 404 returned error can't find the container with id 255d500f7eb4647b4711e2580c60040dbbf560e417a87af23f6e0bafe4af4bfb Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.985600 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.985631 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.985642 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.985659 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.985670 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:15Z","lastTransitionTime":"2026-01-31T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:15 crc kubenswrapper[4783]: I0131 09:05:15.996718 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.037463 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.074482 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.087802 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.087834 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.087847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.087863 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.087875 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.116226 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.159941 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.189593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.189629 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.189640 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.189655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.189665 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.196293 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.237205 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.278195 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.291288 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.291321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.291330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.291345 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.291354 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.316309 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.393680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.393715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.393725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.393739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.393750 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.495472 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.495798 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.495810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.495823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.495834 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.597827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.597860 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.597871 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.597882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.597890 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.620439 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 05:44:01.391033243 +0000 UTC Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.700043 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.700076 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.700088 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.700100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.700108 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.755907 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.757127 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mwdww" event={"ID":"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299","Type":"ContainerStarted","Data":"1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.757154 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mwdww" event={"ID":"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299","Type":"ContainerStarted","Data":"255d500f7eb4647b4711e2580c60040dbbf560e417a87af23f6e0bafe4af4bfb"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.759514 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e" exitCode=0 Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.759560 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.767743 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.781739 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.791822 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.802775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.802806 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.802815 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.802830 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.802840 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.804802 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.814096 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.824120 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.832316 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.845618 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.854997 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.862028 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.869673 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.881938 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.888794 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.896473 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.904726 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.904758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.904768 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.904783 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.904793 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:16Z","lastTransitionTime":"2026-01-31T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.916225 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.963927 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:16 crc kubenswrapper[4783]: I0131 09:05:16.997475 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.007223 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.007254 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.007263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc 
kubenswrapper[4783]: I0131 09:05:17.007279 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.007290 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.038006 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0
321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.077187 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819ee
db413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.110929 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.111228 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.111240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.111255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.111265 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.120117 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.156819 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.194107 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.214122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.214181 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.214193 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.214217 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.214234 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.236072 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.279062 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.315345 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.316845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.316888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.316900 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 
09:05:17.316918 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.316935 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.356975 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.397472 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.419735 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.419770 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.419779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.419793 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.419803 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.436973 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.477202 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.516539 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.522882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.522930 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.522942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.522968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.522981 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.621545 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 17:51:07.939680282 +0000 UTC Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.632263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.632293 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.632303 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.632317 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.632330 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.644726 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.644818 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:17 crc kubenswrapper[4783]: E0131 09:05:17.644950 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.645080 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:17 crc kubenswrapper[4783]: E0131 09:05:17.645219 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:17 crc kubenswrapper[4783]: E0131 09:05:17.645078 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.734566 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.734593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.734602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.734613 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.734621 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.765283 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae" exitCode=0 Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.765371 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.776797 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.786238 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.804319 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.814962 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.826599 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837219 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837234 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837269 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.837474 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.845715 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.854533 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.874650 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.915428 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.939835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.939879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.939892 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.939905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.939920 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:17Z","lastTransitionTime":"2026-01-31T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.959422 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:17 crc kubenswrapper[4783]: I0131 09:05:17.995731 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.036506 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.041733 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.041775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.041785 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.041802 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.041816 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.076966 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.117088 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.143805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.143849 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.143862 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.143876 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.143904 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.246190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.246233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.246244 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.246259 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.246272 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.348255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.348291 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.348302 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.348318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.348332 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.449969 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.450012 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.450024 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.450040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.450052 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.551977 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.552015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.552026 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.552046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.552058 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.621828 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 07:36:52.520937239 +0000 UTC Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.654346 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.654377 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.654385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.654397 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.654405 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.756560 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.756592 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.756599 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.756613 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.756624 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.769911 4783 generic.go:334] "Generic (PLEG): container finished" podID="04e04066-c510-4203-90b8-3296993cb94f" containerID="4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20" exitCode=0 Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.769976 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerDied","Data":"4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.779789 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.780069 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.780100 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.790382 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.800516 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.802742 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.802852 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.811872 4783 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.821692 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.829351 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.837324 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.845010 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.852723 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.858209 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.858241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.858252 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.858266 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.858277 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.868263 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.876472 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.885039 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.896177 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.905234 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.913858 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.922494 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.931956 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.942952 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.950852 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960117 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960144 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:18Z","lastTransitionTime":"2026-01-31T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.960915 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.970068 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.980039 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0
321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:18 crc kubenswrapper[4783]: I0131 09:05:18.997878 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:18Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.040337 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.062652 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.062684 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.062694 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.062709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.062721 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.076504 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.114687 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.155282 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.164626 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.164654 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.164665 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.164679 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.164688 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.200964 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.235284 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.266963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.266995 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.267005 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.267020 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 
09:05:19.267029 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.275357 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.315786 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.369252 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.369287 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.369298 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.369314 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.369324 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.471709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.471743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.471753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.471768 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.471778 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.536319 4783 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.573706 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.573740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.573751 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.573766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.573775 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.622216 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 21:47:50.053632548 +0000 UTC Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.645637 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:19 crc kubenswrapper[4783]: E0131 09:05:19.645738 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.645986 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:19 crc kubenswrapper[4783]: E0131 09:05:19.646040 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.646176 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:19 crc kubenswrapper[4783]: E0131 09:05:19.646234 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.658510 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.667296 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676109 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676128 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676146 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676155 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.676944 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.692001 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.705552 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.716437 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.725118 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d38
14cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.737654 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.744768 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.758122 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.767382 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.778512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.778627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.778688 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.778749 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.778807 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.787373 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" event={"ID":"04e04066-c510-4203-90b8-3296993cb94f","Type":"ContainerStarted","Data":"c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.787440 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.794448 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.836184 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.878471 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.880658 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.880689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.880699 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.880714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.880724 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.918020 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.956416 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.983113 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.983149 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.983157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.983184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.983194 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:19Z","lastTransitionTime":"2026-01-31T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:19 crc kubenswrapper[4783]: I0131 09:05:19.996774 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:05:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.041568 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.075208 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.085704 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.085741 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.085750 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.085765 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 
09:05:20.085774 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.116319 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.156670 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.188068 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.188115 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.188126 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.188137 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.188145 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.197046 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.228187 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.228219 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.228228 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.228239 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.228265 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.237755 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.237783 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.240662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.240691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.240699 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.240709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.240715 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.249298 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.251463 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.251491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.251500 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.251513 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.251538 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.260494 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.262498 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.262534 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.262542 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.262552 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.262559 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.274707 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.276695 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.277385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.277499 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.277587 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.277657 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.277730 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.285743 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: E0131 09:05:20.285855 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.290001 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.290027 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.290036 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.290047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.290058 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.316879 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.356511 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.391988 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.392013 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.392021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.392043 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.392053 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.397914 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.436642 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.482914 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b
54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.494418 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.494445 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.494454 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.494467 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.494476 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.516269 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.595924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.595952 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.595963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc 
kubenswrapper[4783]: I0131 09:05:20.595973 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.595981 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.622604 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 11:56:40.4318232 +0000 UTC Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.698267 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.698291 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.698299 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.698310 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.698319 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.790971 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/0.log" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.792771 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954" exitCode=1 Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.792799 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.793264 4783 scope.go:117] "RemoveContainer" containerID="ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.800173 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.800201 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.800210 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.800221 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.800235 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.803963 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.813090 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.826591 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.834375 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.843501 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.852224 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.859728 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.867725 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.874685 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.901735 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.901758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.901769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.901781 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.901789 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:20Z","lastTransitionTime":"2026-01-31T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.915001 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.959989 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"andler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:05:20.468321 6063 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.468394 6063 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 09:05:20.468218 6063 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 09:05:20.468509 6063 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:05:20.468945 6063 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.468998 6063 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.469096 6063 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e
7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:20 crc kubenswrapper[4783]: I0131 09:05:20.995784 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:20Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.003674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.003702 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.003711 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.003727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.003736 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.037605 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.077097 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.105888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.105919 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.105928 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.105940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.105950 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.116458 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.208293 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.208321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.208330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.208342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.208351 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.310286 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.310315 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.310330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.310342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.310351 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.411955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.411988 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.411999 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.412011 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.412019 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.514049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.514098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.514108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.514122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.514133 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.615856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.615887 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.615895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.615906 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.615915 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.623047 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 08:36:19.546407965 +0000 UTC Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.645486 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.645512 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.645486 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:21 crc kubenswrapper[4783]: E0131 09:05:21.645615 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:21 crc kubenswrapper[4783]: E0131 09:05:21.645648 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:21 crc kubenswrapper[4783]: E0131 09:05:21.645690 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.717776 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.717806 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.717814 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.717824 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.717833 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.797414 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/1.log" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.797992 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/0.log" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.800435 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf" exitCode=1 Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.800488 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.800520 4783 scope.go:117] "RemoveContainer" containerID="ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.801369 4783 scope.go:117] "RemoveContainer" containerID="0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf" Jan 31 09:05:21 crc kubenswrapper[4783]: E0131 09:05:21.801558 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.810402 4783 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.819705 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.819739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.819749 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.819763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.819790 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.823795 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.831685 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.840789 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.848891 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.859070 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.867655 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.880566 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90
cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.888364 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.895413 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.902833 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.916943 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee24c843eea937714143b20d6fdabbdaa446a697fc59fe5835eb3a42bcc6c954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:20Z\\\",\\\"message\\\":\\\"andler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:05:20.468321 6063 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.468394 6063 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 09:05:20.468218 6063 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 09:05:20.468509 6063 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:05:20.468945 6063 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.468998 6063 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:05:20.469096 6063 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\
"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 
09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.921586 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.921628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.921640 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.921662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.921675 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:21Z","lastTransitionTime":"2026-01-31T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.923826 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.931482 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:21 crc kubenswrapper[4783]: I0131 09:05:21.939500 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.024231 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 
09:05:22.024281 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.024293 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.024309 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.024320 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.127502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.127591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.127611 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.127633 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.127658 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.230861 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.230897 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.230908 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.230939 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.230949 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.332468 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.332548 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.332558 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.332572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.332581 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.434615 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.434651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.434662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.434674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.434683 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.536353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.536390 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.536402 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.536415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.536425 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.624108 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 14:57:55.364829724 +0000 UTC Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.638467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.638507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.638520 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.638545 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.638554 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.739770 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.739804 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.739814 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.739826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.739835 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.803903 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/1.log" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.806420 4783 scope.go:117] "RemoveContainer" containerID="0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf" Jan 31 09:05:22 crc kubenswrapper[4783]: E0131 09:05:22.806566 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.814455 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.824071 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.833089 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.841361 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.841401 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.841432 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.841450 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.841461 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.842340 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.850320 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.863284 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.870763 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.879831 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.887755 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.894696 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.901985 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.908516 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.915182 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.930276 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.937149 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.942987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.943020 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.943030 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.943045 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:22 crc kubenswrapper[4783]: I0131 09:05:22.943055 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:22Z","lastTransitionTime":"2026-01-31T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.044550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.044616 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.044629 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.044662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.044675 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.146855 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.146914 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.146929 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.146946 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.146957 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.249482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.249517 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.249537 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.249555 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.249567 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.351491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.351520 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.351539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.351550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.351559 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.445722 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.445892 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:05:39.445867276 +0000 UTC m=+50.114550744 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.452967 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.452998 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.453008 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.453020 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.453030 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.546238 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.546274 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.546302 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.546323 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546369 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546398 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546415 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546430 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546438 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:39.546420586 +0000 UTC m=+50.215104064 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546442 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546452 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546480 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546496 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546459 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:39.546451044 +0000 UTC m=+50.215134512 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546555 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:39.546539349 +0000 UTC m=+50.215222818 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.546587 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:39.546563025 +0000 UTC m=+50.215246493 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.554429 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.554460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.554481 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.554492 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.554501 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.625238 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 15:11:47.807884 +0000 UTC Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.645562 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.645589 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.645570 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.645679 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.645776 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:23 crc kubenswrapper[4783]: E0131 09:05:23.645864 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.656554 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.656588 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.656599 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.656615 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.656627 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.758318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.758343 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.758350 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.758361 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.758369 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.860376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.860407 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.860415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.860428 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.860437 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.961953 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.961983 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.961995 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.962007 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:23 crc kubenswrapper[4783]: I0131 09:05:23.962015 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:23Z","lastTransitionTime":"2026-01-31T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.064374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.064404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.064413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.064424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.064432 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.165962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.165997 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.166006 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.166022 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.166034 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.267354 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.267404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.267413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.267424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.267432 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.369341 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.369372 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.369379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.369390 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.369399 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.471021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.471048 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.471055 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.471064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.471079 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.535273 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd"] Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.535677 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.537019 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.537126 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.544636 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.565809 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572698 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572737 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572751 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572762 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.572837 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce6
1c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.579912 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d38
14cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.591729 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.598006 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.605520 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.613726 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.621755 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.626177 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 23:56:10.941039971 +0000 UTC Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.630516 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.638197 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.651523 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.654086 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.654112 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1725409a-e6c3-4770-8341-2e390ff5e44b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.654131 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kqkh\" (UniqueName: \"kubernetes.io/projected/1725409a-e6c3-4770-8341-2e390ff5e44b-kube-api-access-7kqkh\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" 
Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.654181 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.658898 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.667866 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.674733 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.674760 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.674768 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.674779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.674790 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.675666 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.682342 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.755540 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.755578 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1725409a-e6c3-4770-8341-2e390ff5e44b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.755599 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kqkh\" (UniqueName: \"kubernetes.io/projected/1725409a-e6c3-4770-8341-2e390ff5e44b-kube-api-access-7kqkh\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.755639 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.756220 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.756358 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1725409a-e6c3-4770-8341-2e390ff5e44b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.760922 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1725409a-e6c3-4770-8341-2e390ff5e44b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.768203 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kqkh\" (UniqueName: \"kubernetes.io/projected/1725409a-e6c3-4770-8341-2e390ff5e44b-kube-api-access-7kqkh\") pod \"ovnkube-control-plane-749d76644c-jslrd\" (UID: \"1725409a-e6c3-4770-8341-2e390ff5e44b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.776255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 
09:05:24.776284 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.776311 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.776325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.776334 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.845254 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" Jan 31 09:05:24 crc kubenswrapper[4783]: W0131 09:05:24.854970 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1725409a_e6c3_4770_8341_2e390ff5e44b.slice/crio-f078a305baf7b173fa1482baf657bd19338aebd3e5d3fefba82139cf071f036f WatchSource:0}: Error finding container f078a305baf7b173fa1482baf657bd19338aebd3e5d3fefba82139cf071f036f: Status 404 returned error can't find the container with id f078a305baf7b173fa1482baf657bd19338aebd3e5d3fefba82139cf071f036f Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.878321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.878348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.878358 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.878370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.878379 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.981509 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.981556 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.981565 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.981582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:24 crc kubenswrapper[4783]: I0131 09:05:24.981592 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:24Z","lastTransitionTime":"2026-01-31T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.083213 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.083252 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.083262 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.083276 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.083284 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.186035 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.186066 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.186074 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.186086 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.186094 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.288225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.288265 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.288274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.288288 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.288298 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.390042 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.390081 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.390091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.390107 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.390117 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.492374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.492409 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.492419 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.492430 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.492439 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.594065 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.594099 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.594108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.594119 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.594129 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.626552 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 14:03:07.250673605 +0000 UTC Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.645103 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.645143 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.645182 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:25 crc kubenswrapper[4783]: E0131 09:05:25.645235 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:25 crc kubenswrapper[4783]: E0131 09:05:25.645293 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:25 crc kubenswrapper[4783]: E0131 09:05:25.645344 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.696136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.696179 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.696190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.696203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.696211 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.798412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.798447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.798459 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.798471 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.798480 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.814041 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" event={"ID":"1725409a-e6c3-4770-8341-2e390ff5e44b","Type":"ContainerStarted","Data":"762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.814074 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" event={"ID":"1725409a-e6c3-4770-8341-2e390ff5e44b","Type":"ContainerStarted","Data":"0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.814085 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" event={"ID":"1725409a-e6c3-4770-8341-2e390ff5e44b","Type":"ContainerStarted","Data":"f078a305baf7b173fa1482baf657bd19338aebd3e5d3fefba82139cf071f036f"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.824396 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.832956 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T0
9:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.841621 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.848492 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.857971 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.870996 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.878937 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.886778 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.894478 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.899634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.899664 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.899674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.899685 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.899694 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:25Z","lastTransitionTime":"2026-01-31T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.903026 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.911667 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.924599 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.937696 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.947568 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.956102 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:25 crc kubenswrapper[4783]: I0131 09:05:25.963141 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:05:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.001800 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.001855 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.001867 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.001890 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.001904 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.103920 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.103964 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.103974 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.103987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.103997 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.205798 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.205831 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.205841 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.205859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.205870 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.307504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.307553 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.307565 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.307581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.307592 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.337660 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xg6x2"] Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.338245 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: E0131 09:05:26.338317 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.347910 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.355415 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.362873 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.368354 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.368387 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676hz\" (UniqueName: \"kubernetes.io/projected/84961ed7-35f8-4e6a-987c-cabb84cf7268-kube-api-access-676hz\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.375430 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.381974 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.388667 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc 
kubenswrapper[4783]: I0131 09:05:26.396078 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.405064 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.410073 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.410106 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.410116 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.410131 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.410141 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.413853 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.422454 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.430846 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.439936 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 
09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.448024 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.458880 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.467021 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.468861 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.468903 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676hz\" (UniqueName: \"kubernetes.io/projected/84961ed7-35f8-4e6a-987c-cabb84cf7268-kube-api-access-676hz\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: E0131 09:05:26.469043 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:26 crc kubenswrapper[4783]: E0131 09:05:26.469115 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:26.96909729 +0000 UTC m=+37.637780759 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.475518 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.482836 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676hz\" (UniqueName: \"kubernetes.io/projected/84961ed7-35f8-4e6a-987c-cabb84cf7268-kube-api-access-676hz\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.488946 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\
\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.512447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.512477 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.512486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.512503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: 
I0131 09:05:26.512516 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.614632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.614662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.614672 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.614689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.614697 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.626937 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:03:35.419613178 +0000 UTC Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.716822 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.716857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.716866 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.716882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.716891 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.817951 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.817980 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.817987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.817999 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.818010 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.919257 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.919283 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.919292 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.919305 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.919315 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:26Z","lastTransitionTime":"2026-01-31T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.929689 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.943550 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.950225 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.956956 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc 
kubenswrapper[4783]: I0131 09:05:26.964654 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.972333 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.973502 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:26 crc kubenswrapper[4783]: E0131 09:05:26.973689 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:26 crc kubenswrapper[4783]: E0131 09:05:26.973736 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:05:27.973723999 +0000 UTC m=+38.642407467 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.978829 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resol
ver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.986368 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d38
14cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:26 crc kubenswrapper[4783]: I0131 09:05:26.994574 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.004650 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.012598 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021114 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021304 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021352 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021364 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.021374 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.028794 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-co
ntroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.035560 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a134255284776
63cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.047607 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.055085 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.064044 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.071950 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.123754 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.123785 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.123796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.123810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.123821 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.226647 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.226682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.226692 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.226705 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.226717 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.328889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.328915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.328927 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.328937 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.328963 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.430272 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.430300 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.430308 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.430318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.430345 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.532033 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.532438 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.532577 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.532653 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.532712 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.627991 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 18:32:35.975441384 +0000 UTC Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.634157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.634197 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.634206 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.634222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.634231 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.645649 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.645780 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.645801 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.645856 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.645936 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.646004 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.646075 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.646139 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.735658 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.735687 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.735695 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.735708 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.735718 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.837000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.837031 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.837040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.837053 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.837079 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.938663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.938690 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.938697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.938706 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.938714 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:27Z","lastTransitionTime":"2026-01-31T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:27 crc kubenswrapper[4783]: I0131 09:05:27.982948 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.983085 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:27 crc kubenswrapper[4783]: E0131 09:05:27.983141 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:29.983126338 +0000 UTC m=+40.651809796 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.040646 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.040678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.040686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.040698 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.040722 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.142872 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.143059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.143067 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.143081 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.143089 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.245492 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.245567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.245578 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.245590 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.245601 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.347113 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.347141 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.347150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.347184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.347200 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.449032 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.449077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.449086 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.449096 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.449105 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.550687 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.550714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.550722 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.550737 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.550746 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.628913 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 22:57:19.702027244 +0000 UTC Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.652451 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.652498 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.652507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.652518 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.652527 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.754218 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.754248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.754256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.754271 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.754281 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.856277 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.856301 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.856309 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.856321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.856328 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.958033 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.958060 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.958100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.958114 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:28 crc kubenswrapper[4783]: I0131 09:05:28.958129 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:28Z","lastTransitionTime":"2026-01-31T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.059959 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.059991 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.059999 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.060009 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.060017 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.161692 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.161718 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.161728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.161739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.161748 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.263264 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.263304 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.263316 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.263330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.263340 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.365127 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.365176 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.365186 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.365198 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.365206 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.466595 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.466630 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.466639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.466651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.466660 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.568248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.568273 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.568281 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.568292 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.568300 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.629442 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 11:12:16.272059543 +0000 UTC Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.644754 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.644759 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.644866 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.644891 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.644768 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.644929 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.645016 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.645088 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.654282 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.662277 4783 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670057 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670246 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670271 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670279 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670290 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.670299 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.678368 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.685965 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.694012 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.702332 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f0
3cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.714462 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.722098 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.730682 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.737521 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: 
I0131 09:05:29.748954 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.755432 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.762403 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc 
kubenswrapper[4783]: I0131 09:05:29.769828 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.771376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.771405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.771415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.771427 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.771437 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.779340 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.785298 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.874107 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.874159 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.874194 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.874215 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.874225 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.976220 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.976250 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.976259 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.976273 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.976281 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:29Z","lastTransitionTime":"2026-01-31T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:29 crc kubenswrapper[4783]: I0131 09:05:29.998886 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.998993 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:29 crc kubenswrapper[4783]: E0131 09:05:29.999082 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:33.999064452 +0000 UTC m=+44.667747930 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.078016 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.078038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.078047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.078058 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.078066 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.179609 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.179631 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.179639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.179651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.179658 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.281700 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.281725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.281733 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.281745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.281753 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.383325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.383356 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.383364 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.383374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.383384 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.419345 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.419372 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.419379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.419388 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.419396 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.428096 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.430381 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.430416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.430425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.430439 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.430449 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.438424 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.440628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.440657 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.440667 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.440677 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.440686 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.448630 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.450547 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.450574 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.450584 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.450594 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.450600 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.457994 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.459823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.459850 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.459859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.459870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.459877 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.467022 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:30 crc kubenswrapper[4783]: E0131 09:05:30.467121 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.484943 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.484963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.484973 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.484986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.484994 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.586824 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.586849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.586857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.586888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.586897 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.629857 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:41:46.054576273 +0000 UTC Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.688562 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.688601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.688609 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.688618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.688626 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.789616 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.789644 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.789652 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.789660 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.789667 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.890875 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.890907 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.890915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.890926 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.890934 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.993010 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.993040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.993052 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.993062 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:30 crc kubenswrapper[4783]: I0131 09:05:30.993070 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:30Z","lastTransitionTime":"2026-01-31T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.094106 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.094131 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.094140 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.094150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.094157 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.195844 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.195872 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.195880 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.195891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.195899 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.297820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.297854 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.297865 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.297879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.297889 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.399357 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.399395 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.399405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.399416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.399426 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.501251 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.501280 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.501289 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.501299 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.501308 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.602318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.602345 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.602353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.602362 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.602368 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.630882 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 19:36:40.191023499 +0000 UTC Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.645290 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:31 crc kubenswrapper[4783]: E0131 09:05:31.645427 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.645726 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.645790 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:31 crc kubenswrapper[4783]: E0131 09:05:31.645864 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:31 crc kubenswrapper[4783]: E0131 09:05:31.645890 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.645987 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:31 crc kubenswrapper[4783]: E0131 09:05:31.646296 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.703964 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.703993 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.704002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.704013 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.704025 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.805531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.805578 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.805589 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.805603 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.805615 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.907104 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.907137 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.907149 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.907184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:31 crc kubenswrapper[4783]: I0131 09:05:31.907195 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:31Z","lastTransitionTime":"2026-01-31T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.008624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.008692 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.008704 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.008720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.008735 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.109952 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.109977 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.109986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.109996 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.110005 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.211852 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.211882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.211891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.211902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.211912 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.313784 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.313816 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.313825 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.313835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.313844 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.415384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.415418 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.415427 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.415437 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.415444 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.517046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.517083 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.517091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.517102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.517112 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.618126 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.618158 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.618184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.618196 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.618204 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.631577 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:45:13.596483259 +0000 UTC Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.719440 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.719466 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.719474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.719485 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.719494 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.821151 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.821215 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.821231 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.821244 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.821256 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.923212 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.923241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.923249 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.923261 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:32 crc kubenswrapper[4783]: I0131 09:05:32.923269 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:32Z","lastTransitionTime":"2026-01-31T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.025317 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.025350 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.025373 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.025383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.025390 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.127200 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.127238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.127248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.127263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.127274 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.229184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.229212 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.229221 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.229232 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.229240 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.331056 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.331086 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.331094 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.331103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.331111 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.432220 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.432248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.432256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.432266 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.432273 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.533979 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.534012 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.534021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.534031 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.534037 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.632489 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 07:13:37.524821224 +0000 UTC Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.635905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.635942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.635951 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.635961 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.635968 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.645453 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.645481 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.645498 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.645452 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:33 crc kubenswrapper[4783]: E0131 09:05:33.645560 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:33 crc kubenswrapper[4783]: E0131 09:05:33.645621 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:33 crc kubenswrapper[4783]: E0131 09:05:33.645704 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:33 crc kubenswrapper[4783]: E0131 09:05:33.646035 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.646188 4783 scope.go:117] "RemoveContainer" containerID="0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.739276 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.739315 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.739326 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.739339 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.739350 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.833503 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/1.log" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.835986 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.836203 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.840706 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.840736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.840748 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.840764 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.840775 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.847753 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.860186 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.876615 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.889031 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.900833 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.918865 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.933077 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.943304 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.943340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.943350 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.943363 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.943376 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:33Z","lastTransitionTime":"2026-01-31T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.944214 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.952855 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.961091 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.969086 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.976243 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.983700 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:33 crc kubenswrapper[4783]: I0131 09:05:33.995518 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.003528 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.011694 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.019697 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.029269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:34 crc kubenswrapper[4783]: E0131 09:05:34.029458 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Jan 31 09:05:34 crc kubenswrapper[4783]: E0131 09:05:34.029525 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:42.029509113 +0000 UTC m=+52.698192581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.045915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.045962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.045974 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.045997 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.046011 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.148308 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.148369 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.148383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.148405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.148423 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.251102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.251429 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.251440 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.251457 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.251470 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.353342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.353375 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.353385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.353399 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.353409 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.454766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.454808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.454820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.454832 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.454841 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.556874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.556902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.556911 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.556922 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.556929 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.632619 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:45:00.979874806 +0000 UTC Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.658468 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.658497 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.658506 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.658519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.658529 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.760844 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.760880 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.760891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.760902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.760912 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.844570 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/2.log" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.845372 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/1.log" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.848520 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" exitCode=1 Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.848576 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.848640 4783 scope.go:117] "RemoveContainer" containerID="0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.850547 4783 scope.go:117] "RemoveContainer" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" Jan 31 09:05:34 crc kubenswrapper[4783]: E0131 09:05:34.850751 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.860928 4783 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.863369 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.863391 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.863400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.863411 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.863419 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.870424 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.878406 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.886764 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.895154 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.908127 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.915759 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.924919 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.932861 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.940309 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.948035 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc 
kubenswrapper[4783]: I0131 09:05:34.955250 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.965614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.965639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.965648 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.965659 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.965669 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:34Z","lastTransitionTime":"2026-01-31T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.967946 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.975423 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.983420 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:34 crc kubenswrapper[4783]: I0131 09:05:34.995790 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a83decb2a7b594cb5269833543325abde02bc651d8129d7e6ed188cbdd01fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:21Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:21.410784 6203 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:05:21.410801 6203 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.472455ms\\\\nI0131 09:05:21.411369 6203 obj_retry.go:551] Creating *factory.egressNode crc took: 1.654498ms\\\\nI0131 09:05:21.411394 6203 factory.go:1336] Added *v1.Node event handler 7\\\\nI0131 09:05:21.411419 6203 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0131 09:05:21.411610 6203 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:05:21.411679 6203 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:05:21.411706 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:05:21.411729 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:05:21.411776 6203 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 
09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf1
20a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.002998 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.051892 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.067141 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.067194 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.067204 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.067214 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.067221 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.169490 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.169523 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.169533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.169560 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.169572 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.271440 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.271466 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.271475 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.271486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.271495 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.373238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.373441 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.373504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.373576 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.373636 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.474786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.474895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.474982 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.475050 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.475102 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.577566 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.577623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.577633 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.577650 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.577659 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.633255 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:43:12.557822894 +0000 UTC Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.644714 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.644757 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:35 crc kubenswrapper[4783]: E0131 09:05:35.644807 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.644874 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:35 crc kubenswrapper[4783]: E0131 09:05:35.644928 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:35 crc kubenswrapper[4783]: E0131 09:05:35.644990 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.645094 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:35 crc kubenswrapper[4783]: E0131 09:05:35.645242 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.678913 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.678998 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.679055 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.679115 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.679196 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.781021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.781048 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.781059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.781070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.781079 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.852094 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/2.log" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.854989 4783 scope.go:117] "RemoveContainer" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" Jan 31 09:05:35 crc kubenswrapper[4783]: E0131 09:05:35.855117 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.863881 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.872058 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.882441 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.882475 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.882484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.882495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.882505 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.883203 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.891537 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.898889 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.911669 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cf
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.920340 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.928095 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.935207 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.946868 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.953706 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.960294 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc 
kubenswrapper[4783]: I0131 09:05:35.967974 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.975983 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.984234 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.986090 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.986125 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.986136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.986152 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.986179 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:35Z","lastTransitionTime":"2026-01-31T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:35 crc kubenswrapper[4783]: I0131 09:05:35.992906 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.006712 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.089032 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.089070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.089078 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.089096 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.089105 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.190806 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.190838 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.190848 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.190871 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.190882 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.292865 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.292904 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.292912 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.292926 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.292936 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.394888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.394940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.394950 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.394962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.394971 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.496787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.496815 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.496823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.496835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.496843 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.598125 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.598153 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.598176 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.598186 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.598193 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.633921 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:41:10.387471998 +0000 UTC Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.699860 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.699906 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.699943 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.699954 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.699961 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.801731 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.801762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.801771 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.801787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.801799 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.859536 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.866158 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.869117 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.877814 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.885800 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.894371 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.902278 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.903627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.903679 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.903691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.903705 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.903715 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:36Z","lastTransitionTime":"2026-01-31T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.909359 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.923109 4783 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894
b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.931474 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.940897 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.948910 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.960633 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.967020 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.977904 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc 
kubenswrapper[4783]: I0131 09:05:36.985019 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.992419 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:36 crc kubenswrapper[4783]: I0131 09:05:36.999524 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.005478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.005528 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.005550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.005563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.005572 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.007678 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.107009 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.107036 4783 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.107047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.107059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.107069 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.209016 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.209046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.209057 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.209079 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.209087 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.310932 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.310990 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.311001 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.311015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.311022 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.412684 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.412711 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.412720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.412731 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.412738 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.514509 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.514531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.514549 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.514564 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.514573 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.616319 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.616343 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.616351 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.616362 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.616371 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.634196 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:05:00.857234556 +0000 UTC Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.645509 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.645517 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.645516 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.645566 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:37 crc kubenswrapper[4783]: E0131 09:05:37.645650 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:37 crc kubenswrapper[4783]: E0131 09:05:37.645690 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:37 crc kubenswrapper[4783]: E0131 09:05:37.645743 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:37 crc kubenswrapper[4783]: E0131 09:05:37.645789 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.717971 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.717994 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.718003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.718012 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.718019 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.820188 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.820217 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.820227 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.820238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.820245 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.921647 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.921674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.921682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.921692 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:37 crc kubenswrapper[4783]: I0131 09:05:37.921699 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:37Z","lastTransitionTime":"2026-01-31T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.023425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.023474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.023483 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.023497 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.023506 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.125627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.125655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.125663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.125675 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.125682 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.226993 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.227026 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.227054 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.227066 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.227074 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.328870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.328915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.328926 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.328937 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.328945 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.431103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.431151 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.431186 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.431204 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.431215 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.533124 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.533155 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.533189 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.533200 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.533207 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634505 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 10:38:40.408002846 +0000 UTC Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634767 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.634786 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.736801 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.736827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.736836 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.736846 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.736854 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.839038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.839066 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.839075 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.839085 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.839093 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.941122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.941149 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.941179 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.941193 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:38 crc kubenswrapper[4783]: I0131 09:05:38.941203 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:38Z","lastTransitionTime":"2026-01-31T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.043184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.043250 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.043262 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.043284 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.043295 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.145140 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.145190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.145201 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.145212 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.145220 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.247071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.247098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.247124 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.247136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.247144 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.348577 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.348606 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.348617 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.348627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.348635 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.451091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.451130 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.451142 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.451153 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.451191 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.475216 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.475356 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:11.47533899 +0000 UTC m=+82.144022459 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.552895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.552916 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.552924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.552934 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.552940 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.576381 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.576405 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.576426 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.576442 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576531 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576550 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576559 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576585 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:06:11.576578003 +0000 UTC m=+82.245261462 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576613 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576632 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:11.576626525 +0000 UTC m=+82.245309993 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576666 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576683 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:06:11.576678583 +0000 UTC m=+82.245362052 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576716 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576724 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576730 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.576746 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:06:11.57674104 +0000 UTC m=+82.245424508 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.634566 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 09:06:02.18279732 +0000 UTC Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.644844 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.644886 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.644972 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.645138 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.645204 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.645301 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.645354 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:39 crc kubenswrapper[4783]: E0131 09:05:39.645462 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.653590 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.654519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.654617 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.654676 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.654733 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.654799 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.663120 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.671202 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.678347 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.690831 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cf
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.698356 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.704600 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.715755 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.728770 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.735700 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.742125 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc 
kubenswrapper[4783]: I0131 09:05:39.749123 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.756620 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.756644 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.756652 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.756665 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.756675 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.757600 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.765507 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.773067 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.780834 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.788058 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.796478 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.858714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.858745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.858756 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.858770 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.858781 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.960144 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.960191 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.960202 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.960215 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:39 crc kubenswrapper[4783]: I0131 09:05:39.960225 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:39Z","lastTransitionTime":"2026-01-31T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.063859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.063893 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.063905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.063919 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.063932 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.165786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.165816 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.165825 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.165839 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.165846 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.267476 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.267512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.267522 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.267572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.267583 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.369661 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.369782 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.369850 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.369907 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.369963 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.472399 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.472435 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.472444 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.472458 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.472466 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.574241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.574406 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.574474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.574538 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.574604 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.635502 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 17:22:06.937244203 +0000 UTC Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.676503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.676528 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.676537 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.676552 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.676579 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.778524 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.778564 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.778573 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.778583 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.778773 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.821744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.821764 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.821772 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.821781 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.821788 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.829752 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.831949 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.831989 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.831998 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.832021 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.832028 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.839476 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.841504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.841528 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.841537 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.841569 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.841578 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.848723 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.850801 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.850821 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.850829 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.850856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.850863 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.858648 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.860753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.860777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.860805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.860815 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.860822 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.869874 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:40 crc kubenswrapper[4783]: E0131 09:05:40.869988 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.879827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.879883 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.879893 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.879903 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.879911 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.981116 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.981154 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.981177 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.981187 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:40 crc kubenswrapper[4783]: I0131 09:05:40.981195 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:40Z","lastTransitionTime":"2026-01-31T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.083410 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.083427 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.083434 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.083443 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.083451 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.185447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.185491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.185501 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.185513 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.185520 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.287205 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.287246 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.287258 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.287269 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.287278 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.388824 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.388870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.388878 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.388887 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.388895 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.490373 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.490397 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.490404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.490413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.490420 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.592282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.592297 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.592305 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.592314 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.592320 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.636215 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 19:06:53.191191465 +0000 UTC Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.644760 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.644835 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:41 crc kubenswrapper[4783]: E0131 09:05:41.644865 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.644912 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.644941 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:41 crc kubenswrapper[4783]: E0131 09:05:41.645022 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:41 crc kubenswrapper[4783]: E0131 09:05:41.645104 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:41 crc kubenswrapper[4783]: E0131 09:05:41.645153 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.693639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.693668 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.693678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.693689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.693696 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.794805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.794833 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.794841 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.794853 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.794861 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.896724 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.896759 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.896767 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.896779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.896789 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.998505 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.998597 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.998614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.998638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:41 crc kubenswrapper[4783]: I0131 09:05:41.998650 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:41Z","lastTransitionTime":"2026-01-31T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.096364 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:42 crc kubenswrapper[4783]: E0131 09:05:42.096535 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:42 crc kubenswrapper[4783]: E0131 09:05:42.096611 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:05:58.096593291 +0000 UTC m=+68.765276769 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.100502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.100541 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.100565 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.100582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.100591 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.202126 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.202178 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.202190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.202203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.202214 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.303712 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.303743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.303751 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.303764 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.303775 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.405146 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.405243 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.405260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.405279 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.405292 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.507373 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.507408 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.507418 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.507432 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.507442 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.609014 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.609051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.609083 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.609097 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.609106 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.636584 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:11:06.014204688 +0000 UTC Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.710639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.710667 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.710677 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.710687 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.710695 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.812609 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.812640 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.812649 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.812661 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.812671 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.914723 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.914750 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.914759 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.914770 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:42 crc kubenswrapper[4783]: I0131 09:05:42.914779 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:42Z","lastTransitionTime":"2026-01-31T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.016097 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.016125 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.016133 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.016143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.016151 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.117330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.117357 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.117368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.117377 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.117384 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.219406 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.219450 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.219459 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.219474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.219484 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.320898 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.320931 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.320942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.320953 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.320962 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.422674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.422707 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.422716 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.422732 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.422743 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.524935 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.524965 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.524974 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.524990 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.525003 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.626566 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.626597 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.626608 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.626619 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.626627 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.636888 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 21:53:02.563547374 +0000 UTC Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.645207 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:43 crc kubenswrapper[4783]: E0131 09:05:43.645296 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.645351 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.645392 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.645403 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:43 crc kubenswrapper[4783]: E0131 09:05:43.645449 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:43 crc kubenswrapper[4783]: E0131 09:05:43.645558 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:43 crc kubenswrapper[4783]: E0131 09:05:43.645606 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.728434 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.728469 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.728478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.728493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.728501 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.830830 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.830871 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.830881 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.830895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.830906 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.932311 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.932371 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.932384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.932403 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:43 crc kubenswrapper[4783]: I0131 09:05:43.932415 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:43Z","lastTransitionTime":"2026-01-31T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.034294 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.034420 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.034493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.034564 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.034629 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.136482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.136538 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.136557 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.136571 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.136581 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.238449 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.238473 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.238482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.238494 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.238504 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.340428 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.340475 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.340486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.340498 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.340506 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.442226 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.442257 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.442266 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.442279 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.442288 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.543981 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.544054 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.544065 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.544077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.544087 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.637497 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 21:05:51.326283461 +0000 UTC Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.646088 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.646134 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.646144 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.646180 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.646190 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.747752 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.747773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.747780 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.747791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.747799 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.849660 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.849702 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.849712 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.849725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.849734 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.951958 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.951987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.951995 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.952004 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:44 crc kubenswrapper[4783]: I0131 09:05:44.952014 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:44Z","lastTransitionTime":"2026-01-31T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.054209 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.054237 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.054246 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.054258 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.054271 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.156105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.156136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.156147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.156178 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.156186 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.257803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.257837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.257847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.257861 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.257872 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.359744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.359773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.359783 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.359792 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.359799 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.461299 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.461330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.461342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.461353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.461361 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.563756 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.563790 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.563799 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.563810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.563817 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.638253 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 10:36:15.953774094 +0000 UTC Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.645576 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.645605 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.645639 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:45 crc kubenswrapper[4783]: E0131 09:05:45.645749 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.645776 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:45 crc kubenswrapper[4783]: E0131 09:05:45.645852 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:45 crc kubenswrapper[4783]: E0131 09:05:45.645904 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:45 crc kubenswrapper[4783]: E0131 09:05:45.645966 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.665106 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.665132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.665140 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.665152 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.665175 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.767041 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.767073 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.767091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.767103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.767111 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.868849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.868885 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.868892 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.868906 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.868916 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.970942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.970994 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.971011 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.971027 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:45 crc kubenswrapper[4783]: I0131 09:05:45.971040 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:45Z","lastTransitionTime":"2026-01-31T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.072097 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.072123 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.072132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.072141 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.072148 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.173217 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.173246 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.173254 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.173263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.173271 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.275070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.275092 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.275100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.275109 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.275117 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.376384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.376406 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.376414 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.376423 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.376431 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.478390 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.478421 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.478431 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.478441 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.478451 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.579511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.579541 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.579559 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.579570 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.579579 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.638719 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 11:56:58.744661819 +0000 UTC Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.681634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.681690 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.681701 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.681723 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.681742 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.783976 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.784007 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.784017 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.784028 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.784039 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.886277 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.886312 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.886321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.886335 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.886347 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.988536 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.988589 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.988605 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.988617 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:46 crc kubenswrapper[4783]: I0131 09:05:46.988625 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:46Z","lastTransitionTime":"2026-01-31T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.090353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.090396 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.090404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.090415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.090422 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.192728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.192757 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.192767 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.192777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.192786 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.295208 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.295231 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.295240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.295251 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.295468 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.397317 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.397340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.397347 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.397358 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.397364 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.499358 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.499391 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.499401 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.499413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.499421 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.601574 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.601612 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.601623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.601638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.601647 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.639344 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:40:51.491260845 +0000 UTC Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.644798 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.644853 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:47 crc kubenswrapper[4783]: E0131 09:05:47.644920 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.644950 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.644964 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:47 crc kubenswrapper[4783]: E0131 09:05:47.645034 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:47 crc kubenswrapper[4783]: E0131 09:05:47.645101 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:47 crc kubenswrapper[4783]: E0131 09:05:47.645211 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.703676 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.703706 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.703714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.703725 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.703749 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.804903 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.804935 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.804944 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.804955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.804963 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.906471 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.906498 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.906527 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.906539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:47 crc kubenswrapper[4783]: I0131 09:05:47.906548 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:47Z","lastTransitionTime":"2026-01-31T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.008135 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.008180 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.008188 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.008199 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.008207 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.109772 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.109816 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.109829 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.109839 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.109845 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.211376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.211432 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.211444 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.211456 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.211462 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.313015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.313040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.313047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.313055 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.313062 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.415991 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.416060 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.416071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.416092 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.416111 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.518076 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.518103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.518112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.518145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.518152 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.619713 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.619762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.619774 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.619785 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.619795 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.640295 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 21:28:09.942718169 +0000 UTC Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.646049 4783 scope.go:117] "RemoveContainer" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" Jan 31 09:05:48 crc kubenswrapper[4783]: E0131 09:05:48.646209 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.721741 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.721777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.721787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.721803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.721813 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.823834 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.823868 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.823877 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.823889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.823898 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.925546 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.925582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.925602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.925614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:48 crc kubenswrapper[4783]: I0131 09:05:48.925623 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:48Z","lastTransitionTime":"2026-01-31T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.028056 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.028118 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.028131 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.028154 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.028187 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.132343 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.132372 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.132382 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.132397 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.132407 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.234740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.234790 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.234800 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.234820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.234832 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.336419 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.336463 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.336478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.336494 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.336506 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.438576 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.438602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.438612 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.438622 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.438630 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.540691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.540714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.540722 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.540731 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.540739 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.640463 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 06:24:40.911470226 +0000 UTC Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.642098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.642137 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.642147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.642171 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.642181 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.644679 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:49 crc kubenswrapper[4783]: E0131 09:05:49.644773 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.644797 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:49 crc kubenswrapper[4783]: E0131 09:05:49.644865 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.644883 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:49 crc kubenswrapper[4783]: E0131 09:05:49.644956 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.644895 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:49 crc kubenswrapper[4783]: E0131 09:05:49.645106 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.655795 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.664229 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.671567 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.678984 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.691095 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.701409 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.708672 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc 
kubenswrapper[4783]: I0131 09:05:49.716861 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.725074 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.732766 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.741064 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.743501 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.743611 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.743676 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.743739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.743802 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.748948 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-co
ntroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.756196 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.769247 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.777236 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.786202 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.794302 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.801464 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:05:49Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.845811 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.846123 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.846134 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.846149 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.846179 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.947686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.947740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.947751 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.947768 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:49 crc kubenswrapper[4783]: I0131 09:05:49.947798 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:49Z","lastTransitionTime":"2026-01-31T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.050207 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.050247 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.050255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.050270 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.050282 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.151697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.151729 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.151740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.151753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.151764 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.253784 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.253816 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.253824 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.253840 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.253850 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.355563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.355606 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.355618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.355633 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.355644 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.457745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.457784 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.457795 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.457809 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.457819 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.560968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.561016 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.561032 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.561281 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.561295 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.640590 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:09:46.006143353 +0000 UTC Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.663645 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.663678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.663688 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.663705 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.663716 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.765939 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.765978 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.765987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.766003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.766015 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.867918 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.867958 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.867969 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.867981 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.867995 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.970418 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.970459 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.970470 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.970484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:50 crc kubenswrapper[4783]: I0131 09:05:50.970495 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:50Z","lastTransitionTime":"2026-01-31T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.054859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.054905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.054913 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.054926 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.054934 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.063852 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.066667 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.066698 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.066707 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.066727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.066736 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.077467 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.080544 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.080586 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.080597 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.080611 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.080625 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.090697 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.093823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.093870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.093886 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.093905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.093915 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.103615 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.106260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.106325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.106337 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.106348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.106357 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.115198 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.115321 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.116803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.116842 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.116852 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.116869 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.116878 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.218769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.218802 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.218812 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.218826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.218835 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.320662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.320698 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.320708 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.320723 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.320734 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.422588 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.422618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.422628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.422641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.422649 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.524225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.524264 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.524274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.524287 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.524295 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.626198 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.626228 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.626236 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.626248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.626255 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.640708 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 18:59:06.245471199 +0000 UTC Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.644998 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.645049 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.645095 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.645104 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.645118 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.645204 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.645253 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:51 crc kubenswrapper[4783]: E0131 09:05:51.645304 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.727694 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.727726 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.727736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.727748 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.727756 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.829421 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.829458 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.829468 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.829484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.829495 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.931625 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.931662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.931672 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.931713 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:51 crc kubenswrapper[4783]: I0131 09:05:51.931725 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:51Z","lastTransitionTime":"2026-01-31T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.033843 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.033917 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.033929 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.033942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.033952 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.135253 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.135282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.135292 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.135302 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.135310 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.237005 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.237036 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.237045 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.237061 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.237071 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.339129 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.339152 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.339180 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.339191 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.339200 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.440437 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.440475 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.440486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.440499 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.440508 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.543852 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.543892 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.543901 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.543913 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.543921 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.641142 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:46:52.740838968 +0000 UTC Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.645717 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.645746 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.645771 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.645784 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.645792 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.747788 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.747815 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.747822 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.747832 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.747839 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.849823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.849877 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.849887 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.849927 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.849939 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.952108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.952135 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.952143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.952154 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:52 crc kubenswrapper[4783]: I0131 09:05:52.952194 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:52Z","lastTransitionTime":"2026-01-31T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.053206 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.053239 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.053249 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.053259 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.053267 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.154949 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.154977 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.154986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.155000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.155009 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.257335 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.257415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.257425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.257439 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.257449 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.359031 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.359067 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.359076 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.359087 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.359098 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.460914 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.460947 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.460955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.460968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.460976 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.562716 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.562753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.562763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.562793 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.562804 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.641638 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 16:59:42.080888262 +0000 UTC Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.644889 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.644940 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:53 crc kubenswrapper[4783]: E0131 09:05:53.644982 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.644992 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.645017 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:53 crc kubenswrapper[4783]: E0131 09:05:53.645126 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:53 crc kubenswrapper[4783]: E0131 09:05:53.645221 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:53 crc kubenswrapper[4783]: E0131 09:05:53.645648 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.663874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.663920 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.663932 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.663944 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.663952 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.765660 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.765680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.765689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.765699 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.765707 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.866988 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.867019 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.867029 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.867038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.867046 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.972030 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.972138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.972273 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.972335 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:53 crc kubenswrapper[4783]: I0131 09:05:53.972402 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:53Z","lastTransitionTime":"2026-01-31T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.074050 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.074079 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.074087 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.074097 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.074105 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.175620 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.175648 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.175657 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.175667 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.175674 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.277150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.277190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.277199 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.277210 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.277218 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.379118 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.379139 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.379147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.379156 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.379188 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.480617 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.480641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.480651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.480663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.480672 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.582365 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.582384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.582392 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.582400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.582407 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.642688 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 04:32:57.797028885 +0000 UTC Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.684492 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.684511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.684519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.684527 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.684534 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.787297 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.787325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.787335 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.787348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.787363 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.889269 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.889308 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.889321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.889342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.889351 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.992049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.992083 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.992091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.992103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:54 crc kubenswrapper[4783]: I0131 09:05:54.992110 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:54Z","lastTransitionTime":"2026-01-31T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.093961 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.093984 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.093991 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.094001 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.094008 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.195869 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.195896 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.195904 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.195914 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.195922 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.297208 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.297230 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.297238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.297247 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.297254 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.399193 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.399213 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.399221 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.399231 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.399239 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.501002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.501040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.501050 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.501064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.501075 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.602341 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.602368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.602379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.602390 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.602399 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.643515 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 05:23:13.63395758 +0000 UTC Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.645278 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:55 crc kubenswrapper[4783]: E0131 09:05:55.645362 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.645415 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:55 crc kubenswrapper[4783]: E0131 09:05:55.645454 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.645645 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:55 crc kubenswrapper[4783]: E0131 09:05:55.645696 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.645972 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:55 crc kubenswrapper[4783]: E0131 09:05:55.646021 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.704057 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.704082 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.704092 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.704103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.704111 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.805763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.805810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.805820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.805831 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.805839 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.907718 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.907747 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.907758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.907769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:55 crc kubenswrapper[4783]: I0131 09:05:55.907777 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:55Z","lastTransitionTime":"2026-01-31T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.009135 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.009161 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.009190 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.009201 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.009208 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.110998 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.111024 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.111032 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.111042 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.111049 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.212374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.212435 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.212444 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.212453 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.212460 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.314336 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.314380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.314391 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.314407 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.314417 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.416348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.416375 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.416384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.416394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.416400 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.518931 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.518957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.518965 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.518975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.518983 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.621201 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.621278 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.621288 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.621297 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.621305 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.644516 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:33:12.221230791 +0000 UTC Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.723105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.723132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.723142 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.723154 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.723185 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.825095 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.825119 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.825127 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.825137 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.825145 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.927346 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.927379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.927413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.927426 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:56 crc kubenswrapper[4783]: I0131 09:05:56.927434 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:56Z","lastTransitionTime":"2026-01-31T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.028579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.028607 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.028616 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.028628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.028635 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.130769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.130798 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.130808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.130817 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.130825 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.233018 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.233077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.233091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.233115 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.233137 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.335152 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.335207 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.335218 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.335232 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.335242 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.436845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.436887 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.436902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.436921 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.436934 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.538340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.538371 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.538380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.538394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.538402 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.639891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.639956 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.639968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.639984 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.639995 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.645242 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:28:39.921156432 +0000 UTC Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.645400 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.645410 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.645468 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.645475 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:57 crc kubenswrapper[4783]: E0131 09:05:57.645549 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:57 crc kubenswrapper[4783]: E0131 09:05:57.645660 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:57 crc kubenswrapper[4783]: E0131 09:05:57.645705 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:57 crc kubenswrapper[4783]: E0131 09:05:57.645807 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.742282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.742314 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.742325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.742363 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.742373 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.843925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.843975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.843987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.844000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.844008 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.944984 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.945033 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.945044 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.945056 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:57 crc kubenswrapper[4783]: I0131 09:05:57.945064 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:57Z","lastTransitionTime":"2026-01-31T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.046849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.046880 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.046890 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.046903 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.046910 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.126749 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:58 crc kubenswrapper[4783]: E0131 09:05:58.126875 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:58 crc kubenswrapper[4783]: E0131 09:05:58.126925 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:06:30.126905533 +0000 UTC m=+100.795588991 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.148586 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.148613 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.148623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.148635 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.148643 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.250604 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.250626 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.250635 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.250645 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.250652 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.352425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.352445 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.352453 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.352461 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.352469 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.453533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.453715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.453778 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.453844 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.453958 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.555675 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.555697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.555708 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.555720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.555729 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.645498 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:14:01.491148325 +0000 UTC Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.657644 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.657746 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.657814 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.657872 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.657932 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.760184 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.760332 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.760400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.760469 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.760533 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.863243 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.863297 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.863311 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.863337 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.863351 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.908558 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/0.log" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.908638 4783 generic.go:334] "Generic (PLEG): container finished" podID="0b5ffe9c-191a-4902-8e13-6a869f158784" containerID="11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c" exitCode=1 Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.908686 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerDied","Data":"11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.909307 4783 scope.go:117] "RemoveContainer" containerID="11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.918169 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.927262 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.942904 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.951105 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.958139 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc 
kubenswrapper[4783]: I0131 09:05:58.964848 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.964882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.964891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.964908 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.964919 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:58Z","lastTransitionTime":"2026-01-31T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.966481 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.974499 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.983628 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:58 crc kubenswrapper[4783]: I0131 09:05:58.991749 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.001462 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.011439 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.022103 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.030888 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.041265 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.051391 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.058801 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.070634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.070673 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.070684 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.070697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.070707 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.072786 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.082258 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.172702 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.172791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.172869 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc 
kubenswrapper[4783]: I0131 09:05:59.172941 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.173009 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.275701 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.275736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.275745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.275760 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.275769 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.377375 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.377401 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.377412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.377424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.377431 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.479180 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.479208 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.479217 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.479230 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.479257 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.580911 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.580935 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.580945 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.580957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.580965 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.644794 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.644850 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.644795 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:05:59 crc kubenswrapper[4783]: E0131 09:05:59.644896 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:05:59 crc kubenswrapper[4783]: E0131 09:05:59.644993 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:05:59 crc kubenswrapper[4783]: E0131 09:05:59.645038 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.645061 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:05:59 crc kubenswrapper[4783]: E0131 09:05:59.645229 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.646058 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 21:46:38.244574764 +0000 UTC Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.656850 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.665997 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.674648 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.682016 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.683116 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.683143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.683152 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.683196 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.683211 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.691326 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\",\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.704084 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.716609 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.724792 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.732726 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.747966 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56
dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.757299 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.765039 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.773801 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.784815 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.784843 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.784852 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.784885 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.784897 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.786438 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.794456 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.802092 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc 
kubenswrapper[4783]: I0131 09:05:59.810733 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.819600 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.886728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.886757 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.886767 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.886781 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.886789 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.911765 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/0.log" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.911821 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerStarted","Data":"6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.920260 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.928907 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.937155 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.946840 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.955359 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.963333 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.975776 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.983493 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.988336 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.988368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.988400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.988413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.988422 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:05:59Z","lastTransitionTime":"2026-01-31T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:05:59 crc kubenswrapper[4783]: I0131 09:05:59.994149 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:05:59Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.002812 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.010481 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.018285 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\"
:\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.026753 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.033754 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.041618 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.054607 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.061448 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.068041 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:00 crc 
kubenswrapper[4783]: I0131 09:06:00.090766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.090810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.090822 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.090841 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.090855 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.197117 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.197274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.197444 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.197722 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.197744 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.299849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.299879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.299889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.299902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.299911 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.402158 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.402285 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.402345 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.402416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.402478 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.504096 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.504122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.504132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.504144 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.504153 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.606222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.606252 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.606261 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.606277 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.606287 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.646384 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:32:53.72285995 +0000 UTC Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.708354 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.708376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.708384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.708394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.708403 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.809796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.809828 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.809838 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.809851 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.809859 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.910898 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.910920 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.910929 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.910940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:00 crc kubenswrapper[4783]: I0131 09:06:00.910949 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:00Z","lastTransitionTime":"2026-01-31T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.012606 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.012652 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.012664 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.012686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.012700 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.114570 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.114615 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.114625 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.114638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.114648 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.216686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.216726 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.216741 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.216759 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.216771 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.318939 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.318977 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.318992 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.319006 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.319015 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.420773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.420823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.420836 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.420851 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.420863 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.492805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.492837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.492851 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.492864 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.492873 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.505390 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.507685 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.507710 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.507717 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.507728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.507735 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.516106 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.518304 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.518328 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.518338 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.518350 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.518375 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.526647 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.528961 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.528990 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.528999 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.529009 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.529019 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.537386 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.539596 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.539624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.539634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.539649 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.539659 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.547263 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.547378 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.548442 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.548474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.548484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.548497 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.548505 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.645500 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.645542 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.645605 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.645750 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.645795 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.645856 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.645968 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:01 crc kubenswrapper[4783]: E0131 09:06:01.646436 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.646460 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 00:00:06.901865042 +0000 UTC Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.646748 4783 scope.go:117] "RemoveContainer" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.650327 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.650445 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.650512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.650604 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.650735 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.752935 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.752957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.752966 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.752983 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.752993 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.855077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.855112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.855122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.855136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.855146 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.919880 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/2.log" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.922963 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.923394 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.935418 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.956628 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.957608 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.957634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.957645 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.957661 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.957672 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:01Z","lastTransitionTime":"2026-01-31T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.965890 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-co
ntroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.975513 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.984934 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:01 crc kubenswrapper[4783]: I0131 09:06:01.993028 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a134255284776
63cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:01Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.009837 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.020793 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.033001 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.040721 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.054534 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 
09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.059463 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.059495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.059504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.059516 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.059525 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.062312 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.068634 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc 
kubenswrapper[4783]: I0131 09:06:02.075646 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.083730 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.093336 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.101594 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.109923 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.161291 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.161323 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.161334 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.161352 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.161363 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.263211 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.263246 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.263256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.263271 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.263279 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.365026 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.365047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.365055 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.365064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.365073 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.467083 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.467112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.467120 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.467136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.467146 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.568986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.569120 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.569209 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.569290 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.569340 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.646534 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:49:23.681337627 +0000 UTC Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.671456 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.671483 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.671493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.671507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.671514 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.773541 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.773567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.773578 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.773601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.773608 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.875150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.875183 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.875192 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.875202 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.875209 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.926699 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/3.log" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.927219 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/2.log" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.929353 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" exitCode=1 Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.929388 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.929435 4783 scope.go:117] "RemoveContainer" containerID="f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.929818 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:06:02 crc kubenswrapper[4783]: E0131 09:06:02.929949 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.948121 4783 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01
-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c9
24d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.960225 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04
f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kube
let\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.969975 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a134255284776
63cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.976442 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.976478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.976487 4783 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.976502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.976510 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:02Z","lastTransitionTime":"2026-01-31T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.987620 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9
bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:02 crc kubenswrapper[4783]: I0131 09:06:02.996401 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.004206 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.011440 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.023762 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68c5b16b68d485c8d6e516798a459f6d0ca947053c9cf413035215911c8d8c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:34Z\\\",\\\"message\\\":\\\"Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266650 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266658 6428 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:05:34.266303 6428 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-vr882\\\\nI0131 09:05:34.266332 6428 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-6h2bb\\\\nI0131 09:05:34.266712 6428 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-6h2bb in node crc\\\\nI0131 09:05:34\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:06:02Z\\\",\\\"message\\\":\\\"e]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334513 
6835 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334578 6835 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:06:02.334680 6835 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:06:02.334712 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:06:02.334746 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:06:02.334824 6835 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9
cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.031125 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.038809 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc 
kubenswrapper[4783]: I0131 09:06:03.046893 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.054698 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.063532 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.071822 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.078261 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.078282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.078291 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.078302 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.078310 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.080920 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.088472 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.098463 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.107412 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.179683 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.179709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.179720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.179735 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.179744 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.281915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.281940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.281949 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.281959 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.281968 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.383658 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.383679 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.383687 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.383696 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.383704 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.485477 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.485518 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.485530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.485545 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.485556 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.586807 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.586831 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.586838 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.586847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.586855 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.644891 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.644953 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:03 crc kubenswrapper[4783]: E0131 09:06:03.644984 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.645014 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:03 crc kubenswrapper[4783]: E0131 09:06:03.645108 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:03 crc kubenswrapper[4783]: E0131 09:06:03.645210 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.645254 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:03 crc kubenswrapper[4783]: E0131 09:06:03.645305 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.646872 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:57:04.720621737 +0000 UTC Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.688847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.688924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.688943 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.688968 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.688986 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.790845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.790899 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.790911 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.790928 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.790944 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.892877 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.892913 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.892922 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.892934 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.892943 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.934056 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/3.log" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.936769 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:06:03 crc kubenswrapper[4783]: E0131 09:06:03.937035 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.947664 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.956565 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.964376 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.977904 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.986036 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.995083 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.995127 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.995145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.995179 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.995192 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:03Z","lastTransitionTime":"2026-01-31T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:03 crc kubenswrapper[4783]: I0131 09:06:03.996010 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:03Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.004866 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.015589 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.023135 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\"
:\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.030999 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.037767 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.044785 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.057711 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:06:02Z\\\",\\\"message\\\":\\\"e]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 
09:06:02.334513 6835 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334578 6835 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:06:02.334680 6835 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:06:02.334712 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:06:02.334746 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:06:02.334824 6835 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.064311 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.071054 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc 
kubenswrapper[4783]: I0131 09:06:04.079763 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.087957 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.095556 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:04Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.096648 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.096672 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.096682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.096712 4783 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.096721 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.198521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.198576 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.198591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.198630 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.198644 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.300521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.300554 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.300563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.300575 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.300586 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.402318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.402345 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.402353 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.402364 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.402374 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.503886 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.503943 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.503955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.503970 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.503980 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.605823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.605847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.605856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.605870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.605879 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.647746 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 15:48:50.446423127 +0000 UTC Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.707486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.707521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.707531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.707543 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.707552 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.809596 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.809651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.809661 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.809671 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.809678 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.911870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.911914 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.911928 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.911947 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:04 crc kubenswrapper[4783]: I0131 09:06:04.911963 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:04Z","lastTransitionTime":"2026-01-31T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.014537 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.014569 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.014580 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.014591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.014613 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.116666 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.116715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.116740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.116751 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.116760 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.218738 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.218773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.218783 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.218799 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.218807 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.321121 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.321150 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.321179 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.321203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.321210 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.423410 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.423443 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.423453 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.423470 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.423481 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.525222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.525255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.525264 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.525275 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.525283 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.626573 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.626598 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.626618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.626628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.626636 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.645176 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.645212 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:05 crc kubenswrapper[4783]: E0131 09:06:05.645281 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.645324 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.645352 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:05 crc kubenswrapper[4783]: E0131 09:06:05.645432 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:05 crc kubenswrapper[4783]: E0131 09:06:05.645660 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:05 crc kubenswrapper[4783]: E0131 09:06:05.645777 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.648028 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 14:42:47.995379502 +0000 UTC Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.728745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.728773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.728782 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.728791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.728798 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.830398 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.830533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.830598 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.830682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.830744 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.932290 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.932319 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.932327 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.932336 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:05 crc kubenswrapper[4783]: I0131 09:06:05.932343 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:05Z","lastTransitionTime":"2026-01-31T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.034329 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.034359 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.034367 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.034376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.034385 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.135742 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.135766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.135775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.135785 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.135792 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.237730 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.237753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.237761 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.237769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.237777 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.339639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.339691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.339702 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.339736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.339746 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.441564 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.441634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.441647 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.441669 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.441689 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.543355 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.543393 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.543404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.543419 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.543428 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.644680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.644717 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.644727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.644737 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.644745 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.648916 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 17:15:44.288235695 +0000 UTC Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.746148 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.746197 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.746207 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.746220 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.746230 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.847966 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.847991 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.848002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.848012 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.848019 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.949727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.949756 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.949766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.949777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:06 crc kubenswrapper[4783]: I0131 09:06:06.949785 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:06Z","lastTransitionTime":"2026-01-31T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.050894 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.050922 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.050933 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.050944 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.050953 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.152788 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.152811 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.152821 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.152832 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.152840 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.254951 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.254986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.254994 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.255008 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.255019 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.356378 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.356405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.356414 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.356425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.356433 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.459138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.459175 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.459185 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.459195 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.459202 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.560373 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.560396 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.560405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.560415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.560421 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.645216 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.645260 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:07 crc kubenswrapper[4783]: E0131 09:06:07.645302 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.645316 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.645216 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:07 crc kubenswrapper[4783]: E0131 09:06:07.645394 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:07 crc kubenswrapper[4783]: E0131 09:06:07.645449 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:07 crc kubenswrapper[4783]: E0131 09:06:07.645514 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.649998 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 20:32:26.472811984 +0000 UTC Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.662061 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.662089 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.662098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.662109 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.662117 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.763773 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.763803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.763811 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.763823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.763834 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.866102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.866233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.866320 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.866392 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.866466 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.968305 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.968455 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.968529 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.968597 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:07 crc kubenswrapper[4783]: I0131 09:06:07.968666 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:07Z","lastTransitionTime":"2026-01-31T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.069975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.070143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.070233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.070296 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.070346 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.172418 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.172508 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.172568 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.172639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.172697 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.274663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.275003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.275071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.275148 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.275242 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.377479 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.377520 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.377532 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.377547 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.377556 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.479003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.479147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.479274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.479342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.479409 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.581655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.581688 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.581697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.581712 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.581722 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.650796 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 10:05:54.123361142 +0000 UTC Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.683388 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.683413 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.683422 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.683431 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.683440 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.785614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.785845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.785918 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.785987 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.786048 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.888504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.888530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.888539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.888553 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.888564 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.990325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.990383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.990393 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.990408 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:08 crc kubenswrapper[4783]: I0131 09:06:08.990418 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:08Z","lastTransitionTime":"2026-01-31T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.092791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.092858 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.092876 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.092899 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.092914 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.194618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.194651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.194659 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.194669 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.194675 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.296281 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.296322 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.296332 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.296347 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.296357 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.398159 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.398245 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.398260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.398282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.398294 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.499593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.499614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.499623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.499641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.499648 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.601071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.601110 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.601122 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.601138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.601148 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.645675 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.645782 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.645806 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:09 crc kubenswrapper[4783]: E0131 09:06:09.645927 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.645972 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:09 crc kubenswrapper[4783]: E0131 09:06:09.646110 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:09 crc kubenswrapper[4783]: E0131 09:06:09.646186 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:09 crc kubenswrapper[4783]: E0131 09:06:09.646227 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.651691 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 09:24:05.411882175 +0000 UTC Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.660818 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.671123 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.680790 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.688898 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.696375 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a134255284776
63cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.702204 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.702235 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.702246 4783 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.702260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.702273 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.709102 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.717233 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.724500 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.733237 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.745657 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:06:02Z\\\",\\\"message\\\":\\\"e]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 
09:06:02.334513 6835 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334578 6835 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:06:02.334680 6835 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:06:02.334712 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:06:02.334746 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:06:02.334824 6835 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.752245 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.758857 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc 
kubenswrapper[4783]: I0131 09:06:09.767008 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.775377 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.783855 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.793248 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.801880 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.804017 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.804041 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.804051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.804064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.804074 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.810791 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:09Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.905677 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.905702 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.905711 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.905720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:09 crc kubenswrapper[4783]: I0131 09:06:09.905728 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:09Z","lastTransitionTime":"2026-01-31T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.007796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.007832 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.007842 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.007854 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.007862 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.109649 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.109677 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.109685 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.109700 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.109710 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.211863 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.211885 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.211896 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.211909 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.211918 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.314142 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.314235 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.314251 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.314274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.314291 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.415850 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.415897 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.415922 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.415933 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.415940 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.517676 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.517733 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.517743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.517755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.517765 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.619599 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.619647 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.619663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.619678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.619691 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.652682 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 05:01:11.227965683 +0000 UTC Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.722129 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.722178 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.722189 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.722202 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.722209 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.823475 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.823495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.823503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.823513 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.823520 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.925213 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.925240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.925248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.925258 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:10 crc kubenswrapper[4783]: I0131 09:06:10.925265 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:10Z","lastTransitionTime":"2026-01-31T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.026789 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.026818 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.026845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.026856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.026863 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.128067 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.128091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.128099 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.128108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.128116 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.229334 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.229360 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.229370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.229379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.229387 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.330956 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.330982 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.330990 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.331018 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.331027 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.432483 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.432505 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.432513 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.432521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.432528 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.529330 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.529486 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:07:15.529468313 +0000 UTC m=+146.198151781 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.534872 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.534901 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.534928 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.534940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.534947 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.630548 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.630582 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.630619 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.630638 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630673 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630694 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630707 4783 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630724 4783 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630745 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:07:15.63073484 +0000 UTC m=+146.299418318 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630763 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:07:15.63075539 +0000 UTC m=+146.299438857 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630783 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630793 4783 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630801 4783 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630823 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:07:15.630814741 +0000 UTC m=+146.299498208 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630852 4783 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.630875 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:07:15.630869534 +0000 UTC m=+146.299552992 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.636447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.636471 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.636480 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.636492 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.636499 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.644947 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.644983 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.645058 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.645104 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.645208 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.645294 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.645369 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.645425 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.653022 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 00:57:51.247819882 +0000 UTC Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.701786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.701811 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.701820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.701830 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.701837 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.712693 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.715439 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.715468 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.715478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.715507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.715517 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.723698 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.725940 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.725967 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.725978 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.726008 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.726018 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.733582 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.736932 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.736993 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.737006 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.737029 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.737040 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.745306 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.747455 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.747490 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.747520 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.747533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.747543 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.760041 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:11 crc kubenswrapper[4783]: E0131 09:06:11.760156 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.761042 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.761073 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.761084 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.761097 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.761105 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.863360 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.863389 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.863400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.863412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.863422 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.965379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.965443 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.965453 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.965464 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:11 crc kubenswrapper[4783]: I0131 09:06:11.965471 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:11Z","lastTransitionTime":"2026-01-31T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.067536 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.067558 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.067567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.067576 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.067584 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.169991 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.170041 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.170057 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.170074 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.170085 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.271341 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.271374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.271389 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.271406 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.271420 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.373586 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.373614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.373623 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.373634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.373643 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.474875 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.474914 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.474925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.474941 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.474950 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.576874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.576909 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.576920 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.576934 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.576945 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.654003 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 11:43:12.640674614 +0000 UTC Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.678597 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.678632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.678643 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.678663 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.678672 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.780507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.780567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.780579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.780602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.780616 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.883738 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.883775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.883786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.883799 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.883809 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.985581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.985627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.985638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.985669 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:12 crc kubenswrapper[4783]: I0131 09:06:12.985683 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:12Z","lastTransitionTime":"2026-01-31T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.087358 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.087394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.087403 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.087416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.087424 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.189001 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.189038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.189048 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.189060 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.189068 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.291105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.291147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.291158 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.291202 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.291218 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.393616 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.393659 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.393670 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.393682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.393690 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.495527 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.495565 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.495575 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.495590 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.495601 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.598014 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.598045 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.598053 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.598067 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.598077 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.645606 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.645670 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.645717 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:13 crc kubenswrapper[4783]: E0131 09:06:13.645828 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.645882 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:13 crc kubenswrapper[4783]: E0131 09:06:13.645999 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:13 crc kubenswrapper[4783]: E0131 09:06:13.646072 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:13 crc kubenswrapper[4783]: E0131 09:06:13.646215 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.654595 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:23:19.896998642 +0000 UTC Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.699720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.699753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.699761 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.699770 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.699777 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.800963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.800994 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.801002 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.801026 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.801034 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.903121 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.903157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.903183 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.903198 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:13 crc kubenswrapper[4783]: I0131 09:06:13.903206 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:13Z","lastTransitionTime":"2026-01-31T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.004533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.004561 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.004569 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.004601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.004610 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.106091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.106121 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.106129 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.106138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.106178 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.207716 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.207755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.207763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.207771 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.207777 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.309638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.309671 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.309679 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.309689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.309695 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.411786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.411817 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.411826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.411835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.411842 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.513485 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.513522 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.513530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.513539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.513546 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.615383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.615408 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.615416 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.615425 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.615435 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.655031 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:12:25.427827957 +0000 UTC Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.717594 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.717628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.717636 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.717649 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.717671 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.819197 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.819230 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.819240 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.819267 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.819276 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.920765 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.920794 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.920803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.920813 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:14 crc kubenswrapper[4783]: I0131 09:06:14.920821 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:14Z","lastTransitionTime":"2026-01-31T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.022599 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.022647 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.022658 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.022678 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.022687 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.124022 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.124054 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.124062 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.124071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.124079 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.226009 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.226040 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.226049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.226059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.226068 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.327859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.327921 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.327939 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.327963 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.327981 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.429493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.429527 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.429536 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.429548 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.429564 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.530868 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.530902 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.530911 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.530923 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.530935 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.633105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.633126 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.633134 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.633148 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.633156 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.645496 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:15 crc kubenswrapper[4783]: E0131 09:06:15.645582 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.645646 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:15 crc kubenswrapper[4783]: E0131 09:06:15.645708 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.645866 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.645943 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:15 crc kubenswrapper[4783]: E0131 09:06:15.646085 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:15 crc kubenswrapper[4783]: E0131 09:06:15.646229 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.656013 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 12:03:39.900867034 +0000 UTC Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.735015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.735043 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.735051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.735060 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.735067 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.836929 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.836971 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.836981 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.836992 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.837002 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.939545 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.939587 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.939602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.939624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:15 crc kubenswrapper[4783]: I0131 09:06:15.939638 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:15Z","lastTransitionTime":"2026-01-31T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.041889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.041913 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.041924 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.041936 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.041944 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.143901 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.143933 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.143946 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.143961 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.143972 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.245710 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.245826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.245895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.245955 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.246017 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.347189 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.347225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.347235 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.347248 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.347257 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.449200 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.449227 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.449237 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.449247 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.449254 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.551484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.551512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.551538 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.551550 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.551557 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.645878 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:06:16 crc kubenswrapper[4783]: E0131 09:06:16.646040 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.653322 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.653344 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.653352 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.653363 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.653370 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.656602 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:34:22.893969099 +0000 UTC Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.755681 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.755746 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.755762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.755787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.755800 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.858454 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.858493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.858504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.858517 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.858527 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.960296 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.960336 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.960348 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.960369 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:16 crc kubenswrapper[4783]: I0131 09:06:16.960379 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:16Z","lastTransitionTime":"2026-01-31T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.062450 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.062747 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.062755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.062766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.062773 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.164601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.164631 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.164644 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.164657 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.164667 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.266803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.266838 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.266847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.266857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.266865 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.368091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.368214 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.368293 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.368365 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.368435 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.469582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.469740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.469813 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.469884 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.469965 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.571214 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.571244 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.571252 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.571264 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.571271 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.644925 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:17 crc kubenswrapper[4783]: E0131 09:06:17.645017 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.645149 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:17 crc kubenswrapper[4783]: E0131 09:06:17.645236 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.645347 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.645522 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:17 crc kubenswrapper[4783]: E0131 09:06:17.645668 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:17 crc kubenswrapper[4783]: E0131 09:06:17.645828 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.657120 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 08:57:34.127918632 +0000 UTC Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.673362 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.673446 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.673510 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.673571 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.673625 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.775786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.775813 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.775823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.775835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.775845 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.877557 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.877584 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.877592 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.877602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.877611 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.979154 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.979251 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.979263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.979273 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:17 crc kubenswrapper[4783]: I0131 09:06:17.979281 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:17Z","lastTransitionTime":"2026-01-31T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.080790 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.080842 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.080857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.080874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.080888 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.182570 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.182599 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.182608 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.182617 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.182625 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.284000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.284029 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.284038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.284051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.284059 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.385716 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.385744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.385752 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.385763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.385773 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.487226 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.487254 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.487263 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.487274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.487282 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.589334 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.589362 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.589370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.589380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.589388 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.657459 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:21:08.298609339 +0000 UTC Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.691488 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.691511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.691521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.691531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.691538 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.793708 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.793739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.793747 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.793758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.793765 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.896093 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.896125 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.896133 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.896145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.896154 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.997326 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.997351 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.997360 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.997370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:18 crc kubenswrapper[4783]: I0131 09:06:18.997377 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:18Z","lastTransitionTime":"2026-01-31T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.099462 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.099495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.099503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.099517 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.099526 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.201268 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.201307 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.201321 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.201338 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.201350 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.303572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.303610 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.303621 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.303632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.303640 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.405857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.405891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.405899 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.405915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.405924 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.507582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.507735 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.507812 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.507881 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.507938 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.609546 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.609579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.609589 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.609602 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.609610 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.645015 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.645044 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.645070 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.645089 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:19 crc kubenswrapper[4783]: E0131 09:06:19.645407 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:19 crc kubenswrapper[4783]: E0131 09:06:19.645459 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:19 crc kubenswrapper[4783]: E0131 09:06:19.645507 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:19 crc kubenswrapper[4783]: E0131 09:06:19.645542 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.653924 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.655982 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.657564 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 09:52:45.235048735 +0000 UTC Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.664813 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.673417 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.685642 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.694150 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.702261 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.710644 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.710758 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.710817 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.710879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.710939 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.715849 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.723753 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.733300 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.741903 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.748940 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a134255284776
63cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.756350 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.763953 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.770497 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.777233 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.788783 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:06:02Z\\\",\\\"message\\\":\\\"e]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 
09:06:02.334513 6835 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334578 6835 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:06:02.334680 6835 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:06:02.334712 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:06:02.334746 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:06:02.334824 6835 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.795075 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.801633 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:19Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:19 crc 
kubenswrapper[4783]: I0131 09:06:19.812850 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.812877 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.812886 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.812898 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.812906 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.915108 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.915139 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.915183 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.915200 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:19 crc kubenswrapper[4783]: I0131 09:06:19.915211 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:19Z","lastTransitionTime":"2026-01-31T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.016581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.016610 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.016620 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.016630 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.016638 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.118411 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.118442 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.118450 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.118460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.118468 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.220224 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.220256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.220265 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.220277 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.220285 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.322103 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.322136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.322146 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.322157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.322184 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.423482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.423514 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.423523 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.423538 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.423547 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.525462 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.525493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.525502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.525511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.525522 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.627312 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.627343 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.627354 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.627366 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.627394 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.658073 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:49:05.487714913 +0000 UTC Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.728791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.728827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.728837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.728849 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.728859 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.830750 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.830805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.830816 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.830830 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.830841 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.932200 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.932247 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.932262 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.932283 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:20 crc kubenswrapper[4783]: I0131 09:06:20.932296 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:20Z","lastTransitionTime":"2026-01-31T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.033540 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.033580 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.033590 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.033604 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.033614 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.134866 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.135076 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.135147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.135398 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.135470 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.237580 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.237665 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.237734 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.237787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.237844 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.339233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.339326 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.339469 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.339531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.339592 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.441022 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.441050 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.441061 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.441071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.441078 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.542519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.542538 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.542545 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.542553 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.542559 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644266 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644325 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644432 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644685 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644762 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644767 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.644918 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.644939 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.644994 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.645054 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.645086 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.658422 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:52:13.310301125 +0000 UTC Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.745727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.745793 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.745808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.745823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.745856 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.816887 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.816918 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.816927 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.816959 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.816967 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.825998 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.829046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.829082 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.829092 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.829105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.829114 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.836653 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.839042 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.839068 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.839075 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.839084 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.839091 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.848058 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.850203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.850225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.850234 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.850245 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.850252 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.857771 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.859936 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.859965 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.859992 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.860003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.860010 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.867640 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:21 crc kubenswrapper[4783]: E0131 09:06:21.867773 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.868593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.868621 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.868632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.868645 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.868653 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.971013 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.971053 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.971063 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.971079 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:21 crc kubenswrapper[4783]: I0131 09:06:21.971093 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:21Z","lastTransitionTime":"2026-01-31T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.073335 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.073380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.073389 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.073405 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.073414 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.175204 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.175237 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.175247 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.175260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.175269 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.276474 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.276503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.276512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.276522 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.276529 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.377840 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.377870 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.377879 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.377888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.377896 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.479744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.479796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.479805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.479818 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.479826 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.581271 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.581308 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.581318 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.581332 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.581341 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.659467 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 06:17:51.614438955 +0000 UTC Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.682662 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.682720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.682731 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.682743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.682752 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.784541 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.784568 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.784576 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.784585 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.784592 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.886118 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.886151 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.886182 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.886196 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.886205 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.988485 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.988519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.988529 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.988539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:22 crc kubenswrapper[4783]: I0131 09:06:22.988551 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:22Z","lastTransitionTime":"2026-01-31T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.089941 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.089969 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.089977 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.089986 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.089993 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.191754 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.191800 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.191810 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.191820 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.191827 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.293140 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.293195 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.293205 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.293215 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.293222 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.394829 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.394857 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.394866 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.394875 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.394882 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.496342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.496384 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.496397 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.496412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.496424 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.598638 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.598674 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.598683 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.598697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.598705 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.645573 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.645601 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.645675 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:23 crc kubenswrapper[4783]: E0131 09:06:23.645787 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.645928 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:23 crc kubenswrapper[4783]: E0131 09:06:23.645981 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:23 crc kubenswrapper[4783]: E0131 09:06:23.646082 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:23 crc kubenswrapper[4783]: E0131 09:06:23.646197 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.660354 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:38:44.594258856 +0000 UTC Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.700637 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.700755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.700766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.700778 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.700786 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.802429 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.802460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.802472 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.802484 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.802495 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.903841 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.903874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.903882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.903895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:23 crc kubenswrapper[4783]: I0131 09:06:23.903904 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:23Z","lastTransitionTime":"2026-01-31T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.005993 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.006028 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.006037 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.006051 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.006059 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.107828 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.107860 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.107868 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.107883 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.107891 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.209891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.209917 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.209926 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.209937 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.209945 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.311523 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.311567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.311580 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.311595 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.311607 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.413315 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.413358 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.413367 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.413376 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.413384 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.515008 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.515054 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.515071 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.515088 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.515100 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.616641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.616676 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.616686 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.616699 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.616707 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.660699 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 03:11:54.110162874 +0000 UTC Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.719366 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.719403 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.719412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.719422 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.719430 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.821621 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.821682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.821694 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.821715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.821743 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.923689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.923740 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.923750 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.923766 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:24 crc kubenswrapper[4783]: I0131 09:06:24.923775 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:24Z","lastTransitionTime":"2026-01-31T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.025591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.025641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.025654 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.025672 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.025684 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.127700 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.127759 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.127769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.127787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.127801 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.229639 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.229672 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.229682 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.229694 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.229718 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.331747 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.331786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.331795 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.331812 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.331822 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.433717 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.433767 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.433777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.433789 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.433796 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.535775 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.535819 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.535828 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.535853 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.535883 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.637460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.637486 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.637494 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.637507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.637520 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.644953 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.644965 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:25 crc kubenswrapper[4783]: E0131 09:06:25.645085 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.644969 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.645121 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:25 crc kubenswrapper[4783]: E0131 09:06:25.645267 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:25 crc kubenswrapper[4783]: E0131 09:06:25.645442 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:25 crc kubenswrapper[4783]: E0131 09:06:25.645493 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.661534 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 16:02:56.803021117 +0000 UTC Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.738859 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.738885 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.738895 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.738905 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.738913 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.840668 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.840690 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.840698 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.840709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.840718 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.941697 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.941724 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.941739 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.941749 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:25 crc kubenswrapper[4783]: I0131 09:06:25.941755 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:25Z","lastTransitionTime":"2026-01-31T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.043261 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.043302 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.043332 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.043344 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.043352 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.145086 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.145116 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.145126 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.145138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.145145 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.246684 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.246748 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.246761 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.246771 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.246778 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.348539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.348569 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.348579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.348609 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.348619 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.450327 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.450359 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.450369 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.450379 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.450388 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.551466 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.551495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.551502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.551513 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.551520 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.652463 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.652491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.652500 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.652510 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.652517 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.661676 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:16:37.448227678 +0000 UTC Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.754224 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.754243 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.754251 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.754260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.754267 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.856121 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.856185 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.856194 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.856207 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.856454 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.958222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.958256 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.958265 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.958280 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:26 crc kubenswrapper[4783]: I0131 09:06:26.958288 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:26Z","lastTransitionTime":"2026-01-31T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.059526 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.059654 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.059745 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.059811 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.059871 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.161400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.161454 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.161464 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.161480 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.161491 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.262803 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.262828 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.262836 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.262846 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.262854 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.364593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.364700 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.364861 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.364996 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.365122 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.466320 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.466412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.466479 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.466536 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.466602 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.568539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.568567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.568575 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.568586 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.568593 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.645371 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.645556 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:27 crc kubenswrapper[4783]: E0131 09:06:27.645623 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.645652 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.645679 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:27 crc kubenswrapper[4783]: E0131 09:06:27.645844 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:27 crc kubenswrapper[4783]: E0131 09:06:27.645897 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:27 crc kubenswrapper[4783]: E0131 09:06:27.646049 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.662703 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 22:42:39.49895625 +0000 UTC Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.669729 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.669774 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.669783 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.669797 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.669805 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.771549 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.771577 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.771588 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.771614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.771623 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.873243 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.873351 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.873424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.873498 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.873556 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.974951 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.974979 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.974989 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.974999 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:27 crc kubenswrapper[4783]: I0131 09:06:27.975006 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:27Z","lastTransitionTime":"2026-01-31T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.076282 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.076372 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.076452 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.076529 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.076671 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.178115 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.178138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.178146 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.178176 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.178184 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.279460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.279487 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.279495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.279504 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.279510 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.381443 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.381478 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.381487 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.381502 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.381511 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.483522 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.483548 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.483556 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.483564 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.483571 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.584856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.584889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.584900 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.584912 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.584922 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.663374 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 22:25:07.955205215 +0000 UTC Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.686762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.686786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.686796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.686808 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.686817 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.788479 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.788509 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.788518 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.788528 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.788535 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.890452 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.890483 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.890493 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.890506 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.890514 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.991675 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.991700 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.991709 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.991719 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:28 crc kubenswrapper[4783]: I0131 09:06:28.991727 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:28Z","lastTransitionTime":"2026-01-31T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.093080 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.093110 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.093117 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.093129 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.093138 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.195814 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.195862 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.195874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.195893 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.195904 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.297102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.297143 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.297191 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.297209 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.297218 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.399074 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.399101 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.399109 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.399120 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.399129 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.500595 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.500620 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.500630 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.500641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.500659 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.602015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.602049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.602058 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.602274 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.602305 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.645519 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.645557 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.645566 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:29 crc kubenswrapper[4783]: E0131 09:06:29.645645 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.645673 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:29 crc kubenswrapper[4783]: E0131 09:06:29.645757 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:29 crc kubenswrapper[4783]: E0131 09:06:29.645775 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:29 crc kubenswrapper[4783]: E0131 09:06:29.645811 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.654713 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mwdww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc9cc5f-1f3b-46b6-bf0c-b558160f9299\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1942ed40441dce753dd2e9d59953a6b3a6b662a42bdc65453d5d251c94c7f565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhcln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mwdww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.661946 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84961ed7-35f8-4e6a-987c-cabb84cf7268\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-676hz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xg6x2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc 
kubenswrapper[4783]: I0131 09:06:29.663617 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 08:00:13.414739512 +0000 UTC Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.669816 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9008ddbe551acdbf0c85f1c1ed7417ac0062f2e9628848477b92411d437388\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.677717 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.684250 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-99m9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d46d748b-9274-46b0-9954-e55aaec61853\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dbe6cda3043e7cb63926b4c233f6240ae95ce61c1c629042425b8756e668942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ds7kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-99m9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.691022 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d527565e443212adc88c6b4fe8edf392436d219d9aa43ed621b0a87b0b35fa0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ffjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqnx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.702673 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b3d03a1-7611-470d-a402-4f40ce95a54f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:06:02Z\\\",\\\"message\\\":\\\"e]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 
09:06:02.334513 6835 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:06:02.334578 6835 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0131 09:06:02.334680 6835 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0131 09:06:02.334712 6835 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:06:02.334746 6835 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:06:02.334824 6835 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a47112de2b22e721a3
5344b5a062170397bca0e7904b3309f7da3fc9cf120a29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqc7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vr882\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.703299 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.703323 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.703332 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.703344 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.703352 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.710455 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729d3f64-23f4-47e8-8371-492143c2a4a8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://052e53334214d5a07851eed05c78b6d3530aa8aacea9f074be93c1305a991c3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef7edee2ca6f502b42b699e110ef484b0f0da86cf3d2015cf8321e2e8864f7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7edee2ca6f502b42b699e110ef484b0f0da86cf3d2015cf8321e2e8864f7fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.717810 4783 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.725437 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba1329378e82f2fec7d206b2ebfe8202799c1a5a9a784d6236cdf130d650df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.732626 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d480bf87d2b735df68e977c10025d9344730831f226804482aeadb43850924dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://15e9615a944266c442bf63738317da8ac49ce85a22b7b70179383385c83a001e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.740735 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7dcdae0-fa0c-477f-93ca-03afdca81d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:05:06Z\\\"
,\\\"message\\\":\\\" 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\nI0131 09:05:06.840441 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:05:06.843614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:05:06.843640 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:05:06.843668 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:05:06.843673 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:05:06.848836 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:05:06.848861 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848865 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:05:06.848870 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:05:06.848873 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:05:06.848876 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:05:06.848879 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0131 09:05:06.849045 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0131 09:05:06.850619 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0131 09:05:06.851669 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0131 
09:05:06.851678 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2628542252/tls.crt::/tmp/serving-cert-2628542252/tls.key\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.748101 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"679d0a09-6784-40ec-b05c-c13c3c1211c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47230d6cacae507286532f7482a9e6edde8a378283e55741e3195152b921ca68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fec72fd183d85d1fa948e6cfc5d2c0e1973568d91fa1b31b50cbcd3f62d274a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dccf059282c95da927d1457903e1dc3b528968bb76fa41623a2b0a5ad87351b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.757270 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc94916-57ef-4e93-810b-8d1772d40130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acd5c1f71bcf95041d31c6c0858e8f6b02aac2d44e208a25228924199cfa8527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab2f851c6a3148330f3e19352e904c099dc3b3e68f664cbc62aec049e50e5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aae0cca9ef3d64d4323f775d56154dd4d95078b142c429e8431580734b7fa739\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b95e4b44ec59dad20f1ed9545850a9ce1fc9496df55f82fa2552afeac198579\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.773156 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27413e5b-30a1-474b-99ed-cd4204e35a54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c980819309f4367006dd60aff2d0156f3852bfdf5b455b736bba909b965b3e5\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac59fbfcff2042f88b5849d9cb6fbde9e23ef9ef42d2717e6b9f751fa447768d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68326bb9225a50426b148f52b48ab8da33fb9d5ab9b9b9e97e55a2a1a2f51ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52cf7714cbc7c0f90cb5ae8d35b90e30e24c005cd99a2cdace5ada52d3862425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5812a8f9374a2149a90a729bc9f3e6bb8836eb075c6f5676a49ce982f70a25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d82919e782ad413c58c294f221b842fc3c63090b6894b94ac4d4691ee437cfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d721e32256307aed2c89321611200ee7fa7bd10c146cdb374fdeb15046c039f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:50Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21feec274ffe68fd517bcccfcd1bfbd56dc4b09acb9bc964b212dd9f017c926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:04:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.780665 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.789707 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04e04066-c510-4203-90b8-3296993cb94f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29a28c4e5c9d7564f353b9c9c39e99f1700f7b51c867daeea13f74c8864cad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e6c76279fa1a5f38a252c1f09dbd5779e1258871a2a334eb4ce82a95bca97c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dc5ac6d9c5c7ff29b071bd96189a5cb4654c1734aed9d80d631111f9065c3ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6baf39db30c03966b65e9f1121a392b88ae0321d32a275645b133fa8e6e567f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4138
afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4138afebbb3baab33cae5976487b2376d45194f9b4e1d002185655d2f533e2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecd453a1d59d1bb640d155ac9f92e5ee488841fe4da4b51a15f6c924d03c3aae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4761936d3c6b41acc675a10401acce54b5947d4c1dd9672c3a8e32067ecc4f20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hllrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6h2bb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.798668 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q8td" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5ffe9c-191a-4902-8e13-6a869f158784\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-31T09:05:58Z\\\",\\\"message\\\":\\\"2026-01-31T09:05:13+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287\\\\n2026-01-31T09:05:13+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_49d2ff87-2778-4915-a5f1-1cf17a4f6287 to /host/opt/cni/bin/\\\\n2026-01-31T09:05:13Z [verbose] multus-daemon started\\\\n2026-01-31T09:05:13Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:05:58Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:05:13Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-57xlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q8td\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.805351 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.805385 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.805394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.805407 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.805415 4783 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.806065 4783 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1725409a-e6c3-4770-8341-2e390ff5e44b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0157eb859369682633efcd5b6dd2fd7bb8d7a6f6ff8b54dd4d6b390e991e62dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762d975f03cc4e747ae6aeb9a13425528477663cae6fc8360b89d535c0ab718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7kqkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:05:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jslrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.907587 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.907637 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.907651 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.907665 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:29 crc kubenswrapper[4783]: I0131 09:06:29.907673 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:29Z","lastTransitionTime":"2026-01-31T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.009038 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.009132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.009206 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.009287 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.009363 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.110921 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.110960 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.110970 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.110980 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.110988 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.177445 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:30 crc kubenswrapper[4783]: E0131 09:06:30.177588 4783 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:06:30 crc kubenswrapper[4783]: E0131 09:06:30.177636 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs podName:84961ed7-35f8-4e6a-987c-cabb84cf7268 nodeName:}" failed. No retries permitted until 2026-01-31 09:07:34.177619263 +0000 UTC m=+164.846302741 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs") pod "network-metrics-daemon-xg6x2" (UID: "84961ed7-35f8-4e6a-987c-cabb84cf7268") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.212489 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.212519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.212531 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.212542 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.212551 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.314510 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.314533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.314542 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.314551 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.314559 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.416804 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.416827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.416835 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.416844 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.416850 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.518722 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.518779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.518791 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.518804 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.518812 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.619829 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.619853 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.619863 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.619873 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.619880 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.664217 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:28:13.054612398 +0000 UTC Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.721732 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.721769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.721780 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.721790 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.721798 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.823720 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.823765 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.823774 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.823786 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.823795 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.925667 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.925693 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.925701 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.925710 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:30 crc kubenswrapper[4783]: I0131 09:06:30.925716 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:30Z","lastTransitionTime":"2026-01-31T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.027155 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.027203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.027211 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.027223 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.027231 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.128574 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.128610 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.128621 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.128633 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.128641 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.230293 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.230333 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.230342 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.230354 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.230362 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.331716 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.331752 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.331774 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.331787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.331795 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.433500 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.433525 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.433533 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.433544 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.433552 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.534984 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.535007 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.535014 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.535023 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.535029 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.636729 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.636769 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.636779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.636789 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.636796 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.645154 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.645191 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.645199 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.645237 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:31 crc kubenswrapper[4783]: E0131 09:06:31.645269 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:31 crc kubenswrapper[4783]: E0131 09:06:31.645409 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:31 crc kubenswrapper[4783]: E0131 09:06:31.645505 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:31 crc kubenswrapper[4783]: E0131 09:06:31.645587 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.646046 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:06:31 crc kubenswrapper[4783]: E0131 09:06:31.646189 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vr882_openshift-ovn-kubernetes(4b3d03a1-7611-470d-a402-4f40ce95a54f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.664719 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 06:07:19.808569639 +0000 UTC Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.738004 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.738030 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.738039 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.738048 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.738056 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.839581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.839621 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.839632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.839648 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.839660 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.941869 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.941930 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.941942 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.941964 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:31 crc kubenswrapper[4783]: I0131 09:06:31.941978 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:31Z","lastTransitionTime":"2026-01-31T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.043997 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.044037 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.044049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.044064 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.044075 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.146206 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.146253 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.146262 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.146276 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.146287 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.160090 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.160132 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.160145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.160181 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.160194 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.170146 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.172269 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.172301 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.172313 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.172324 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.172332 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.180179 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.182601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.182646 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.182657 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.182670 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.182680 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.190523 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.192614 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.192704 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.192718 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.192728 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.192737 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.200505 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.202374 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.202394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.202403 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.202412 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.202419 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.210225 4783 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:06:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"acd87756-2b8a-4238-9ef4-5b9ef00df1bf\\\",\\\"systemUUID\\\":\\\"fb2fc674-10e7-4f52-98ab-a2501c80635b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:06:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:06:32 crc kubenswrapper[4783]: E0131 09:06:32.210332 4783 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.248238 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.248259 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.248267 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.248278 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.248285 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.350070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.350101 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.350112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.350124 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.350132 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.452338 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.452370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.452382 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.452394 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.452402 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.554548 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.554600 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.554611 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.554627 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.554639 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.656079 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.656106 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.656117 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.656128 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.656136 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.665569 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 06:27:39.383898723 +0000 UTC Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.757591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.757624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.757634 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.757649 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.757659 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.859658 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.859689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.859715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.859726 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.859734 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.962034 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.962070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.962081 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.962094 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:32 crc kubenswrapper[4783]: I0131 09:06:32.962103 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:32Z","lastTransitionTime":"2026-01-31T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.064059 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.064091 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.064102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.064112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.064120 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.166301 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.166323 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.166330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.166339 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.166346 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.268075 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.268111 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.268120 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.268136 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.268147 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.369551 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.369574 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.369582 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.369591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.369598 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.470934 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.471153 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.471260 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.471338 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.471399 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.573694 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.573724 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.573734 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.573748 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.573755 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.645617 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:33 crc kubenswrapper[4783]: E0131 09:06:33.645710 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.645722 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.645626 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.645766 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:33 crc kubenswrapper[4783]: E0131 09:06:33.645822 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:33 crc kubenswrapper[4783]: E0131 09:06:33.645860 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:33 crc kubenswrapper[4783]: E0131 09:06:33.645902 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.665905 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 14:12:36.073748119 +0000 UTC Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.674930 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.674954 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.674962 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.674972 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.675124 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.776836 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.776866 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.776875 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.776888 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.776897 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.878055 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.878080 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.878089 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.878098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.878106 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.980027 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.980045 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.980053 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.980061 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:33 crc kubenswrapper[4783]: I0131 09:06:33.980067 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:33Z","lastTransitionTime":"2026-01-31T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.081762 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.081812 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.081821 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.081830 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.081837 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.184298 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.184331 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.184340 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.184352 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.184362 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.286415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.286443 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.286451 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.286460 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.286467 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.388594 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.388620 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.388628 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.388637 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.388644 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.490119 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.490147 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.490156 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.490181 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.490190 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.592431 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.592458 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.592467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.592476 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.592484 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.666820 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:53:35.938312738 +0000 UTC Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.693941 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.693975 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.693985 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.693997 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.694007 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.795391 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.795421 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.795430 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.795440 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.795447 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.896482 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.896512 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.896521 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.896530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.896538 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.998041 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.998075 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.998086 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.998100 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:34 crc kubenswrapper[4783]: I0131 09:06:34.998110 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:34Z","lastTransitionTime":"2026-01-31T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.099562 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.099585 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.099593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.099603 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.099609 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.201375 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.201404 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.201415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.201424 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.201432 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.302848 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.302877 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.302889 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.302899 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.302908 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.405019 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.405041 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.405049 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.405058 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.405064 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.506744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.506779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.506800 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.506812 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.506820 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.608543 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.608571 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.608579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.608589 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.608596 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.645194 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.645226 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.645230 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:35 crc kubenswrapper[4783]: E0131 09:06:35.645280 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.645332 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:35 crc kubenswrapper[4783]: E0131 09:06:35.645453 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:35 crc kubenswrapper[4783]: E0131 09:06:35.645522 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:35 crc kubenswrapper[4783]: E0131 09:06:35.645594 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.667805 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:59:32.835627671 +0000 UTC Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.709780 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.709819 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.709829 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.709839 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.709846 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.811194 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.811222 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.811230 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.811241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.811249 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.913084 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.913114 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.913121 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.913129 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:35 crc kubenswrapper[4783]: I0131 09:06:35.913136 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:35Z","lastTransitionTime":"2026-01-31T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.014645 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.014671 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.014680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.014689 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.014696 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.116519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.116546 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.116554 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.116563 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.116570 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.218921 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.218949 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.218957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.218966 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.218974 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.320833 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.320867 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.320875 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.320885 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.320894 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.423472 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.423581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.423655 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.423712 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.423772 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.525884 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.525909 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.525916 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.525925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.525932 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.627827 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.627856 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.627865 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.627874 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.627880 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.668619 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:52:26.969752438 +0000 UTC Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.729884 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.729984 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.729995 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.730004 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.730011 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.831704 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.831727 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.831735 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.831746 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.831756 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.933462 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.933489 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.933499 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.933530 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:36 crc kubenswrapper[4783]: I0131 09:06:36.933540 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:36Z","lastTransitionTime":"2026-01-31T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.035710 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.035738 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.035746 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.035755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.035762 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.138102 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.138135 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.138145 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.138157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.138199 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.239866 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.239896 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.239904 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.239915 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.239922 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.341846 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.341872 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.341882 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.341891 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.341897 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.443339 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.443360 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.443368 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.443393 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.443407 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.545435 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.545471 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.545480 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.545507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.545514 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.644641 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.644656 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:37 crc kubenswrapper[4783]: E0131 09:06:37.644724 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.644753 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.644784 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:37 crc kubenswrapper[4783]: E0131 09:06:37.644845 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:37 crc kubenswrapper[4783]: E0131 09:06:37.644878 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:37 crc kubenswrapper[4783]: E0131 09:06:37.644920 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.646447 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.646471 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.646479 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.646489 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.646498 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.668856 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 02:24:17.920031323 +0000 UTC Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.748431 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.748450 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.748458 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.748467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.748474 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.850805 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.850837 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.850847 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.850860 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.850870 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.952380 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.952406 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.952414 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.952423 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:37 crc kubenswrapper[4783]: I0131 09:06:37.952430 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:37Z","lastTransitionTime":"2026-01-31T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.053542 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.053565 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.053573 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.053585 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.053594 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.155189 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.155214 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.155221 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.155229 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.155236 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.256077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.256098 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.256105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.256113 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.256120 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.357978 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.358000 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.358008 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.358017 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.358024 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.459601 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.459624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.459632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.459641 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.459648 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.561710 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.561736 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.561744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.561755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.561762 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.663519 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.663544 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.663552 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.663561 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.663569 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.668964 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 01:02:27.306980764 +0000 UTC Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.764823 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.764845 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.764852 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.764862 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.764869 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.866430 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.866462 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.866473 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.866485 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.866494 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.968112 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.968138 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.968148 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.968157 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:38 crc kubenswrapper[4783]: I0131 09:06:38.968178 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:38Z","lastTransitionTime":"2026-01-31T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.069755 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.069787 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.069796 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.069818 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.069825 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.171361 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.171393 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.171402 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.171415 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.171423 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.272607 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.272715 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.272779 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.272869 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.272935 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.375146 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.375199 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.375211 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.375225 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.375234 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.476591 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.476618 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.476626 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.476640 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.476649 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.579003 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.579036 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.579047 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.579058 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.579067 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.645236 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:39 crc kubenswrapper[4783]: E0131 09:06:39.645328 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.645352 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.645380 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.645396 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:39 crc kubenswrapper[4783]: E0131 09:06:39.645422 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:39 crc kubenswrapper[4783]: E0131 09:06:39.645485 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:39 crc kubenswrapper[4783]: E0131 09:06:39.645554 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.658753 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-99m9k" podStartSLOduration=87.658745025 podStartE2EDuration="1m27.658745025s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.658714977 +0000 UTC m=+110.327398446" watchObservedRunningTime="2026-01-31 09:06:39.658745025 +0000 UTC m=+110.327428493" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.668273 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podStartSLOduration=87.668266546 podStartE2EDuration="1m27.668266546s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.66749506 +0000 UTC m=+110.336178527" watchObservedRunningTime="2026-01-31 09:06:39.668266546 +0000 UTC m=+110.336950015" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.669198 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:05:50.003963915 +0000 UTC Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.681403 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.681467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.681480 4783 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.681503 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.681526 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.692894 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mwdww" podStartSLOduration=87.692873496 podStartE2EDuration="1m27.692873496s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.691888875 +0000 UTC m=+110.360572353" watchObservedRunningTime="2026-01-31 09:06:39.692873496 +0000 UTC m=+110.361556963" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.754429 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.754406681 podStartE2EDuration="20.754406681s" podCreationTimestamp="2026-01-31 09:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.744563621 +0000 UTC m=+110.413247089" watchObservedRunningTime="2026-01-31 09:06:39.754406681 +0000 UTC m=+110.423090149" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.773731 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.77370616 podStartE2EDuration="1m3.77370616s" podCreationTimestamp="2026-01-31 09:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.762847932 +0000 UTC m=+110.431531389" watchObservedRunningTime="2026-01-31 09:06:39.77370616 +0000 UTC m=+110.442389628" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.774085 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.774079996 podStartE2EDuration="1m32.774079996s" podCreationTimestamp="2026-01-31 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.77363728 +0000 UTC m=+110.442320748" watchObservedRunningTime="2026-01-31 09:06:39.774079996 +0000 UTC m=+110.442763464" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783909 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783948 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783960 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783978 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783966 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=89.783953444 podStartE2EDuration="1m29.783953444s" podCreationTimestamp="2026-01-31 09:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.783861791 +0000 UTC m=+110.452545258" watchObservedRunningTime="2026-01-31 09:06:39.783953444 +0000 UTC m=+110.452636911" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.783989 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.801342 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6h2bb" podStartSLOduration=87.801321904 podStartE2EDuration="1m27.801321904s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.799717461 +0000 UTC m=+110.468400930" watchObservedRunningTime="2026-01-31 09:06:39.801321904 +0000 UTC m=+110.470005371" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.811667 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8q8td" podStartSLOduration=87.811651393 podStartE2EDuration="1m27.811651393s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.811537277 +0000 UTC m=+110.480220745" watchObservedRunningTime="2026-01-31 
09:06:39.811651393 +0000 UTC m=+110.480334861" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.820566 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jslrd" podStartSLOduration=86.820550759 podStartE2EDuration="1m26.820550759s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.820502778 +0000 UTC m=+110.489186246" watchObservedRunningTime="2026-01-31 09:06:39.820550759 +0000 UTC m=+110.489234227" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.839911 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=91.839891055 podStartE2EDuration="1m31.839891055s" podCreationTimestamp="2026-01-31 09:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:39.839022965 +0000 UTC m=+110.507706443" watchObservedRunningTime="2026-01-31 09:06:39.839891055 +0000 UTC m=+110.508574523" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.886046 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.886078 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.886087 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.886101 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.886110 4783 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.987878 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.987933 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.987944 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.987957 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:39 crc kubenswrapper[4783]: I0131 09:06:39.987966 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:39Z","lastTransitionTime":"2026-01-31T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.089529 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.089568 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.089578 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.089593 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.089605 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.191529 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.191557 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.191567 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.191595 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.191605 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.293465 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.293516 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.293526 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.293539 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.293547 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.395545 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.395572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.395581 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.395590 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.395598 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.497062 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.497089 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.497096 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.497105 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.497111 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.598997 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.599034 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.599045 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.599058 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.599067 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.669981 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:31:55.325436813 +0000 UTC Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.701015 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.701054 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.701061 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.701070 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.701077 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.802680 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.802706 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.802714 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.802723 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.802730 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.904547 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.904572 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.904579 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.904610 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:40 crc kubenswrapper[4783]: I0131 09:06:40.904618 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:40Z","lastTransitionTime":"2026-01-31T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.006632 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.006730 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.006749 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.006777 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.006795 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.108400 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.108455 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.108472 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.108492 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.108505 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.211432 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.211467 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.211476 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.211491 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.211499 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.313624 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.313668 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.313679 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.313691 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.313698 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.415233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.415259 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.415268 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.415280 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.415291 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.516748 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.516826 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.516839 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.516850 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.516858 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.617884 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.617916 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.617925 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.617935 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.617943 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.645381 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.645410 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.645460 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:41 crc kubenswrapper[4783]: E0131 09:06:41.645571 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.645600 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:41 crc kubenswrapper[4783]: E0131 09:06:41.645677 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:41 crc kubenswrapper[4783]: E0131 09:06:41.645815 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:41 crc kubenswrapper[4783]: E0131 09:06:41.645926 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.670654 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:19:16.840575556 +0000 UTC Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.719719 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.719743 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.719753 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.719763 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.719770 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.821462 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.821487 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.821495 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.821511 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.821519 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.923258 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.923291 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.923301 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.923317 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:41 crc kubenswrapper[4783]: I0131 09:06:41.923336 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:41Z","lastTransitionTime":"2026-01-31T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.024330 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.024359 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.024370 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.024382 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.024392 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.126142 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.126192 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.126202 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.126211 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.126218 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.228077 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.228203 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.228284 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.228356 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.228423 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.330637 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.330744 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.330819 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.330907 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.330965 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.432209 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.432233 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.432241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.432255 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.432264 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.460119 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.460241 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.460315 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.460383 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.460449 4783 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:06:42Z","lastTransitionTime":"2026-01-31T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.488567 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz"] Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.489265 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.490578 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.490674 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.490716 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.490867 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.574503 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfa337-eab6-43bb-9d9c-926c207b44d7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.574553 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ddfa337-eab6-43bb-9d9c-926c207b44d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.574584 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ddfa337-eab6-43bb-9d9c-926c207b44d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.574609 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.574649 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.645461 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.671036 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 05:51:51.891639382 +0000 UTC Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.671153 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675428 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfa337-eab6-43bb-9d9c-926c207b44d7-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675452 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ddfa337-eab6-43bb-9d9c-926c207b44d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675497 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddfa337-eab6-43bb-9d9c-926c207b44d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675518 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675547 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675896 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.675912 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3ddfa337-eab6-43bb-9d9c-926c207b44d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.676265 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ddfa337-eab6-43bb-9d9c-926c207b44d7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.676382 4783 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.681842 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ddfa337-eab6-43bb-9d9c-926c207b44d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.688376 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3ddfa337-eab6-43bb-9d9c-926c207b44d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vg2jz\" (UID: \"3ddfa337-eab6-43bb-9d9c-926c207b44d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: I0131 09:06:42.799999 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" Jan 31 09:06:42 crc kubenswrapper[4783]: W0131 09:06:42.809381 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ddfa337_eab6_43bb_9d9c_926c207b44d7.slice/crio-5885be89c7ba87f072621d45a7478c81401b2371549a4a6f96ce727ce7ee6916 WatchSource:0}: Error finding container 5885be89c7ba87f072621d45a7478c81401b2371549a4a6f96ce727ce7ee6916: Status 404 returned error can't find the container with id 5885be89c7ba87f072621d45a7478c81401b2371549a4a6f96ce727ce7ee6916 Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.025027 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" event={"ID":"3ddfa337-eab6-43bb-9d9c-926c207b44d7","Type":"ContainerStarted","Data":"cd2cb903b55bee703dcf427545e2f7521f4cab44f4d57615971cdf4d62f71049"} Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.025077 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" event={"ID":"3ddfa337-eab6-43bb-9d9c-926c207b44d7","Type":"ContainerStarted","Data":"5885be89c7ba87f072621d45a7478c81401b2371549a4a6f96ce727ce7ee6916"} Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.027205 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/3.log" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.029213 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerStarted","Data":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.029513 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.035284 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vg2jz" podStartSLOduration=91.035275909 podStartE2EDuration="1m31.035275909s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:43.034939963 +0000 UTC m=+113.703623432" watchObservedRunningTime="2026-01-31 09:06:43.035275909 +0000 UTC m=+113.703959376" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.055287 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podStartSLOduration=91.055276282 podStartE2EDuration="1m31.055276282s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:43.054363056 +0000 UTC m=+113.723046524" watchObservedRunningTime="2026-01-31 09:06:43.055276282 +0000 UTC m=+113.723959750" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.271853 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xg6x2"] Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.271970 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:43 crc kubenswrapper[4783]: E0131 09:06:43.272041 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.645588 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.645627 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:43 crc kubenswrapper[4783]: E0131 09:06:43.646019 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:43 crc kubenswrapper[4783]: I0131 09:06:43.645717 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:43 crc kubenswrapper[4783]: E0131 09:06:43.646134 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:43 crc kubenswrapper[4783]: E0131 09:06:43.646421 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.645008 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.645011 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.645806 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:45 crc kubenswrapper[4783]: E0131 09:06:45.645969 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.646114 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:45 crc kubenswrapper[4783]: E0131 09:06:45.646197 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xg6x2" podUID="84961ed7-35f8-4e6a-987c-cabb84cf7268" Jan 31 09:06:45 crc kubenswrapper[4783]: E0131 09:06:45.646255 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:06:45 crc kubenswrapper[4783]: E0131 09:06:45.646346 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.797507 4783 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.797582 4783 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.819879 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.820230 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.821319 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7h6ff"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.821726 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.821957 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.822243 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.822946 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823583 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823682 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823877 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823947 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823682 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.823879 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.824678 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.825000 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826342 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826786 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826838 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826863 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826789 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.826963 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827003 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827341 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827491 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827549 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 
09:06:45.827701 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827775 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827832 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827932 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.827934 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828090 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828181 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828218 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828188 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828286 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.828426 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q"] Jan 31 09:06:45 crc 
kubenswrapper[4783]: I0131 09:06:45.828880 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.829058 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.829066 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m54sp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.829460 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.829704 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pn4qw"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.829988 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.830251 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.830573 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.831129 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.831400 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.841224 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9chh8"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.841766 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.846004 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.846911 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.846947 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847059 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847149 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847217 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847316 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847373 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 
09:06:45.847626 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6lht4"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847873 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.847872 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.848076 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.848138 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.848425 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.848474 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.850275 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.850556 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.850755 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.850958 4783 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851230 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851406 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851455 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851556 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851656 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851690 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851771 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.851816 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852028 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.859316 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852040 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.850968 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852081 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852112 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852147 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852199 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.852242 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.858342 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.861722 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.862268 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.862541 4783 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.862174 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.863628 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.865000 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.877753 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.877940 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.878114 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.878125 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.878145 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.878220 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.878272 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.879196 4783 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.879712 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.880117 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.880425 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.881033 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.881293 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.881596 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.881734 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.882016 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.882330 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.882659 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.882995 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.883746 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.885006 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.887216 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.887913 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.888003 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 
09:06:45.888181 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.888203 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7h6ff"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.888187 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.888694 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.889130 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.890210 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.890314 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.891911 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.892730 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893081 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893375 4783 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893584 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893743 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893761 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893767 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.894052 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.894090 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.893981 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.894187 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.894365 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.894775 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.895010 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6lht4"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.895414 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.895916 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pn4qw"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.904578 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905346 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4sq\" (UniqueName: \"kubernetes.io/projected/700365a9-e7e5-413b-a980-42b9abcd61c7-kube-api-access-hr4sq\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905394 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905485 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905633 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905662 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gm7\" (UniqueName: \"kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905682 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-trusted-ca\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905746 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905769 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905795 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700365a9-e7e5-413b-a980-42b9abcd61c7-serving-cert\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905822 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905841 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-config\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905869 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " 
pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905892 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905915 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5m2\" (UniqueName: \"kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905954 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.905974 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.906069 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp"] Jan 31 09:06:45 crc kubenswrapper[4783]: 
I0131 09:06:45.913999 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.914825 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.915440 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.917465 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.917994 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ljfs4"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.918172 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.918472 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.920216 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.920533 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x8w2f"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.920758 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mv84q"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.921382 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.921571 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.921814 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.923396 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.923862 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.924949 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.925058 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9chh8"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.926312 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.926658 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.927387 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.927727 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.928855 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.929290 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.932063 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.932593 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.933995 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.934359 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.935477 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.937508 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sm9jn"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.937988 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.939062 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.939422 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.941939 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.942054 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.942441 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9kcz"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.942779 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.943092 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.943263 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.943390 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.943685 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wb9bj"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.943994 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.944302 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.944321 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.945420 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.945996 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.947134 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.947617 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.947931 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.948471 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.948789 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.949614 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.950714 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ljfs4"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.951452 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.951899 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.953230 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.954916 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x8w2f"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.955809 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.956431 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.962189 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.963553 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.965778 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sm9jn"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.967968 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m54sp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.972521 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.973430 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.974279 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf"] Jan 31 
09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.975393 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.975944 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mv84q"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.976799 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.977647 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.978448 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9kcz"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.979686 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.980060 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.980899 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.981743 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.982051 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:06:45 
crc kubenswrapper[4783]: I0131 09:06:45.982603 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.983436 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.984273 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.985077 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.985920 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-88hqp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.986543 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.986756 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-88hqp"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.987651 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.988310 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5kcff"] Jan 31 09:06:45 crc kubenswrapper[4783]: I0131 09:06:45.988804 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.002319 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006533 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006562 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006602 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006621 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-config\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006636 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e091acc-49d9-4782-b82a-d71e6f276dce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006653 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006689 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964z2\" (UniqueName: \"kubernetes.io/projected/62344451-2a07-4504-833b-de06393277f2-kube-api-access-964z2\") pod \"downloads-7954f5f757-9chh8\" (UID: \"62344451-2a07-4504-833b-de06393277f2\") " pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006707 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5m2\" (UniqueName: \"kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006724 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle\") pod \"console-f9d7485db-xjwbp\" (UID: 
\"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006761 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006801 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-dir\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006827 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr4sq\" (UniqueName: \"kubernetes.io/projected/700365a9-e7e5-413b-a980-42b9abcd61c7-kube-api-access-hr4sq\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006865 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006895 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4g6d\" (UniqueName: 
\"kubernetes.io/projected/6e091acc-49d9-4782-b82a-d71e6f276dce-kube-api-access-s4g6d\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006920 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-serving-cert\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006937 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006962 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-encryption-config\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.006986 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007002 4783 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-c7gm7\" (UniqueName: \"kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007017 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-trusted-ca\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007040 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007067 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007084 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88wj\" (UniqueName: \"kubernetes.io/projected/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-kube-api-access-d88wj\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007100 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007115 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007130 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e091acc-49d9-4782-b82a-d71e6f276dce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007144 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7fh\" (UniqueName: \"kubernetes.io/projected/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-kube-api-access-vg7fh\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007177 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/700365a9-e7e5-413b-a980-42b9abcd61c7-serving-cert\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007195 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-client\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007209 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-policies\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007526 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-config\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007735 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.007988 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008000 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008460 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008567 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008659 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/700365a9-e7e5-413b-a980-42b9abcd61c7-trusted-ca\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008737 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.008912 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.011462 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.011571 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700365a9-e7e5-413b-a980-42b9abcd61c7-serving-cert\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.012009 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.012633 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.031120 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-brrd5"] Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.032249 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.038437 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-brrd5"] Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.041857 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.062147 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.081337 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.101246 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.107881 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e091acc-49d9-4782-b82a-d71e6f276dce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc 
kubenswrapper[4783]: I0131 09:06:46.107916 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7fh\" (UniqueName: \"kubernetes.io/projected/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-kube-api-access-vg7fh\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.107942 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-client\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.107959 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-policies\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.107975 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.107996 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e091acc-49d9-4782-b82a-d71e6f276dce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108025 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964z2\" (UniqueName: \"kubernetes.io/projected/62344451-2a07-4504-833b-de06393277f2-kube-api-access-964z2\") pod \"downloads-7954f5f757-9chh8\" (UID: \"62344451-2a07-4504-833b-de06393277f2\") " pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108051 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-dir\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108084 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4g6d\" (UniqueName: \"kubernetes.io/projected/6e091acc-49d9-4782-b82a-d71e6f276dce-kube-api-access-s4g6d\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108099 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-serving-cert\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108121 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-encryption-config\") pod 
\"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108150 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108201 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108219 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88wj\" (UniqueName: \"kubernetes.io/projected/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-kube-api-access-d88wj\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108528 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-dir\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108870 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.108909 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e091acc-49d9-4782-b82a-d71e6f276dce-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.109008 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.109122 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-audit-policies\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.110318 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-etcd-client\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.110564 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.111216 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e091acc-49d9-4782-b82a-d71e6f276dce-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.111420 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-encryption-config\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.111602 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-serving-cert\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.141448 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.161564 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.182042 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.202038 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.221939 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.241947 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.262348 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.281992 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.301586 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.321699 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.342218 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.362706 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.381817 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.401993 4783 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.422189 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.441722 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.462127 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.482387 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.502197 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.521787 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.542018 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.562289 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.581324 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.601587 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.621778 4783 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.641909 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.662147 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.682341 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.701745 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.721814 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.741790 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.762350 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.782401 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.802241 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 
09:06:46.822324 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.841474 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.861714 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.881762 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.901558 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.921866 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.940925 4783 request.go:700] Waited for 1.002603209s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.941792 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.962093 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 09:06:46 crc kubenswrapper[4783]: I0131 09:06:46.982339 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 
09:06:47.001384 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.021512 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.041280 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.066321 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.082333 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.102502 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.121478 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.141340 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.162409 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.181339 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.201692 4783 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.221676 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.241688 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.261725 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.281925 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.301947 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.321887 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.341391 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.361532 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.381570 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.401492 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.421549 4783 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.441612 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.461592 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.481490 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.501661 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.521756 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.542157 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.561843 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.581844 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.607637 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.622481 4783 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.641617 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.644762 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.644766 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.644814 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.644829 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.661448 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.681489 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.702208 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.722143 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.742244 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.761461 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.781984 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.802529 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.833881 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5m2\" (UniqueName: \"kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2\") pod \"controller-manager-879f6c89f-skr4f\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.852640 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr4sq\" (UniqueName: \"kubernetes.io/projected/700365a9-e7e5-413b-a980-42b9abcd61c7-kube-api-access-hr4sq\") pod \"console-operator-58897d9998-6lht4\" (UID: \"700365a9-e7e5-413b-a980-42b9abcd61c7\") " pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.872722 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gm7\" (UniqueName: \"kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7\") pod \"console-f9d7485db-xjwbp\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.901621 4783 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.922273 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.938016 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.941748 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.960602 4783 request.go:700] Waited for 1.852383567s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.972868 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7fh\" (UniqueName: \"kubernetes.io/projected/63366f9c-1f0c-4f9c-ae86-3298cb4274f9-kube-api-access-vg7fh\") pod \"cluster-samples-operator-665b6dd947-fl86q\" (UID: \"63366f9c-1f0c-4f9c-ae86-3298cb4274f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.983981 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" Jan 31 09:06:47 crc kubenswrapper[4783]: I0131 09:06:47.995507 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4g6d\" (UniqueName: \"kubernetes.io/projected/6e091acc-49d9-4782-b82a-d71e6f276dce-kube-api-access-s4g6d\") pod \"openshift-apiserver-operator-796bbdcf4f-t6nkc\" (UID: \"6e091acc-49d9-4782-b82a-d71e6f276dce\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.015123 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88wj\" (UniqueName: \"kubernetes.io/projected/0aa3bf42-302f-4ad1-9a65-d2c878e957a6-kube-api-access-d88wj\") pod \"apiserver-7bbb656c7d-jtdqd\" (UID: \"0aa3bf42-302f-4ad1-9a65-d2c878e957a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.033470 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.035033 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964z2\" (UniqueName: \"kubernetes.io/projected/62344451-2a07-4504-833b-de06393277f2-kube-api-access-964z2\") pod \"downloads-7954f5f757-9chh8\" (UID: \"62344451-2a07-4504-833b-de06393277f2\") " pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.042309 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.048713 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.054728 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.054884 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.062696 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.077287 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.082086 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.101493 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.102979 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.124448 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128640 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128668 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfs2w\" (UniqueName: \"kubernetes.io/projected/c7522dc2-2021-4ca3-8ece-f051f1149e61-kube-api-access-gfs2w\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128687 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128704 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnblf\" (UniqueName: \"kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128720 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-serving-cert\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128734 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128747 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128761 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6n4\" (UniqueName: \"kubernetes.io/projected/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-kube-api-access-9x6n4\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128774 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-client\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128787 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: 
\"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128819 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128831 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e762bee-f41c-4af8-96ac-543b92f1f983-serving-cert\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128844 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128856 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128880 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-serving-cert\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128895 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkc67\" (UniqueName: \"kubernetes.io/projected/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-kube-api-access-qkc67\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128911 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-machine-approver-tls\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128925 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9281391-bb2a-40e2-ba91-bb6892bd888f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128946 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128959 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128973 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.128988 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-auth-proxy-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129002 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfffq\" (UniqueName: \"kubernetes.io/projected/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-kube-api-access-zfffq\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129014 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-image-import-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129030 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129043 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129058 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129071 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit-dir\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129085 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129100 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129114 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129126 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-service-ca-bundle\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129141 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129155 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129192 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129206 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-serving-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.129221 4783 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.629208664 +0000 UTC m=+119.297892132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129251 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-config\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129271 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjks\" (UniqueName: \"kubernetes.io/projected/6e762bee-f41c-4af8-96ac-543b92f1f983-kube-api-access-9fjks\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129298 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: 
\"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129312 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129325 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj98b\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129338 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-config\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129356 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129371 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129384 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3101bbf9-84d2-42a0-a530-516f06015a0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129397 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-encryption-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129419 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129487 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129508 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129523 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129572 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3101bbf9-84d2-42a0-a530-516f06015a0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129590 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-images\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129606 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6j67\" (UniqueName: \"kubernetes.io/projected/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-kube-api-access-l6j67\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129642 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129669 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129685 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129705 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-node-pullsecrets\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129747 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129787 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129802 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9281391-bb2a-40e2-ba91-bb6892bd888f-config\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129823 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdf2\" (UniqueName: \"kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: 
\"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129845 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dzb\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-kube-api-access-c4dzb\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.129873 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9281391-bb2a-40e2-ba91-bb6892bd888f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.142149 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.162183 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231044 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231193 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ee769697-49da-45f8-824a-583871b70123-cert\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.231226 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.731205061 +0000 UTC m=+119.399888518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231265 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfs2w\" (UniqueName: \"kubernetes.io/projected/c7522dc2-2021-4ca3-8ece-f051f1149e61-kube-api-access-gfs2w\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231294 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231311 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231327 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnblf\" (UniqueName: \"kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231346 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvfk\" (UniqueName: \"kubernetes.io/projected/a57e8066-1fe2-4664-9415-fa8a8b6621c5-kube-api-access-mtvfk\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231363 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85df2\" (UniqueName: \"kubernetes.io/projected/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-kube-api-access-85df2\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231377 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-images\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: 
\"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231393 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-client\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231408 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231424 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231440 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231469 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231485 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231498 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231512 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231528 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-machine-approver-tls\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231543 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/006ff35e-de43-4a64-bae9-f57f78f8d389-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231560 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfffq\" (UniqueName: \"kubernetes.io/projected/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-kube-api-access-zfffq\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231575 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-plugins-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231590 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231605 4783 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231618 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57e8066-1fe2-4664-9415-fa8a8b6621c5-metrics-tls\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231635 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231652 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit-dir\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231665 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 
09:06:48.231681 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231694 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-node-bootstrap-token\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231709 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231724 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk99d\" (UniqueName: \"kubernetes.io/projected/593183b0-1bb3-42d1-8949-d1b56f0ac114-kube-api-access-kk99d\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231739 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231753 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231766 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a57e8066-1fe2-4664-9415-fa8a8b6621c5-config-volume\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231780 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lpq\" (UniqueName: \"kubernetes.io/projected/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-kube-api-access-66lpq\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231795 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231808 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-serving-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231822 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj98b\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231836 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231851 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ff35e-de43-4a64-bae9-f57f78f8d389-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231875 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-serving-cert\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231893 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231906 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/579c1ccd-dc05-4543-9f3e-9e82915896dc-metrics-tls\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231922 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231937 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231953 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-encryption-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231967 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-stats-auth\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231983 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.231997 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-key\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232011 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.232027 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9c9g\" (UniqueName: \"kubernetes.io/projected/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-kube-api-access-m9c9g\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232066 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g84sk\" (UniqueName: \"kubernetes.io/projected/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-kube-api-access-g84sk\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232081 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232094 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwhx\" (UniqueName: \"kubernetes.io/projected/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-kube-api-access-xqwhx\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232115 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3101bbf9-84d2-42a0-a530-516f06015a0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232148 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232184 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232202 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232225 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-node-pullsecrets\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232240 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjqxb\" (UniqueName: \"kubernetes.io/projected/cbbd545d-5fde-4209-84e8-9252737745c4-kube-api-access-gjqxb\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232254 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-default-certificate\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232268 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xngp\" (UniqueName: \"kubernetes.io/projected/4b11032c-d124-4cd4-a810-90b94f3755cc-kube-api-access-4xngp\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232283 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9281391-bb2a-40e2-ba91-bb6892bd888f-config\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232297 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232312 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-srv-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232329 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232344 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232360 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8ns\" (UniqueName: \"kubernetes.io/projected/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-kube-api-access-5m8ns\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232373 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232386 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94e21399-452d-4820-8683-6536189c56c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232399 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9343e0fc-ae90-490e-9f5a-eb1668c75226-proxy-tls\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232404 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232427 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: 
\"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232441 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94e21399-452d-4820-8683-6536189c56c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232458 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-serving-cert\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232473 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-serving-cert\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232495 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232510 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6n4\" 
(UniqueName: \"kubernetes.io/projected/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-kube-api-access-9x6n4\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232523 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-certs\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232548 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-config\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232559 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232563 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjn74\" (UniqueName: \"kubernetes.io/projected/4c28e17f-e541-4f08-88cf-0bd130b756cc-kube-api-access-sjn74\") pod \"migrator-59844c95c7-lljb6\" (UID: \"4c28e17f-e541-4f08-88cf-0bd130b756cc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.232627 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4std2\" (UniqueName: \"kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232650 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232668 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e762bee-f41c-4af8-96ac-543b92f1f983-serving-cert\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232683 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-proxy-tls\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232739 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-serving-cert\") pod \"apiserver-76f77b778f-7h6ff\" (UID: 
\"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232759 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkc67\" (UniqueName: \"kubernetes.io/projected/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-kube-api-access-qkc67\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232777 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42nt\" (UniqueName: \"kubernetes.io/projected/ee769697-49da-45f8-824a-583871b70123-kube-api-access-k42nt\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232800 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232818 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232834 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/f9281391-bb2a-40e2-ba91-bb6892bd888f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232874 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-mountpoint-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232890 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-auth-proxy-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232916 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-image-import-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232932 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-cabundle\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232950 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-etcd-client\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232965 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cbbd545d-5fde-4209-84e8-9252737745c4-tmpfs\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232991 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233007 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233021 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-webhook-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: 
\"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233045 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233059 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-profile-collector-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233073 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j4cr\" (UniqueName: \"kubernetes.io/projected/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-kube-api-access-7j4cr\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233087 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-config\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233101 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhkv\" (UniqueName: \"kubernetes.io/projected/9c24edcb-aeef-44a1-99b6-9e7904c41253-kube-api-access-xqhkv\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233110 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233116 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-config\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233133 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fjks\" (UniqueName: \"kubernetes.io/projected/6e762bee-f41c-4af8-96ac-543b92f1f983-kube-api-access-9fjks\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.233156 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.234014 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-image-import-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235325 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-node-pullsecrets\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235725 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235767 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-config\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235796 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235826 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3101bbf9-84d2-42a0-a530-516f06015a0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235842 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-csi-data-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235868 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235886 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph8ck\" (UniqueName: \"kubernetes.io/projected/752260dc-3c62-401d-90a9-3eb60ad4b8fa-kube-api-access-ph8ck\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235902 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdnr4\" (UniqueName: \"kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235918 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-service-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235934 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235948 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-registration-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235973 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235989 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-srv-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236017 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236031 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ff35e-de43-4a64-bae9-f57f78f8d389-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236052 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9cz2\" (UniqueName: \"kubernetes.io/projected/543641a4-6d2d-437f-93a5-478579e0622f-kube-api-access-d9cz2\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236072 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236088 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-images\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236104 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6j67\" (UniqueName: \"kubernetes.io/projected/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-kube-api-access-l6j67\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236119 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236143 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsnl6\" (UniqueName: 
\"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-kube-api-access-qsnl6\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236176 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-socket-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236190 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-config\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236207 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236274 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 
09:06:48.235855 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.235970 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9281391-bb2a-40e2-ba91-bb6892bd888f-config\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236185 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236380 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.232682 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.236646 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236678 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c24edcb-aeef-44a1-99b6-9e7904c41253-service-ca-bundle\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236696 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-metrics-certs\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236700 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7522dc2-2021-4ca3-8ece-f051f1149e61-audit-dir\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236718 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdf2\" (UniqueName: \"kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: 
\"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236735 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhpc\" (UniqueName: \"kubernetes.io/projected/579c1ccd-dc05-4543-9f3e-9e82915896dc-kube-api-access-bnhpc\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236751 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9343e0fc-ae90-490e-9f5a-eb1668c75226-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236781 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dzb\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-kube-api-access-c4dzb\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236800 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9281391-bb2a-40e2-ba91-bb6892bd888f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236819 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2rx\" (UniqueName: \"kubernetes.io/projected/9343e0fc-ae90-490e-9f5a-eb1668c75226-kube-api-access-hc2rx\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236834 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8jcd\" (UniqueName: \"kubernetes.io/projected/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-kube-api-access-w8jcd\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236852 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236874 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-serving-ca\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.236951 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.237396 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.237499 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-etcd-client\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.238004 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-images\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.238181 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-auth-proxy-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.238406 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.238786 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-config\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239154 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.239423 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.739405076 +0000 UTC m=+119.408088543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239420 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e762bee-f41c-4af8-96ac-543b92f1f983-service-ca-bundle\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239458 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239709 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-config\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239713 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-config\") pod \"machine-approver-56656f9798-mlb7b\" (UID: 
\"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.239729 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.240388 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.240484 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3101bbf9-84d2-42a0-a530-516f06015a0c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.240672 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.241903 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.242650 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.242903 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.243207 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-serving-cert\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.243567 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-serving-cert\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.243680 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.243811 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.243905 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-machine-approver-tls\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.244205 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e762bee-f41c-4af8-96ac-543b92f1f983-serving-cert\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.244394 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7522dc2-2021-4ca3-8ece-f051f1149e61-encryption-config\") pod \"apiserver-76f77b778f-7h6ff\" (UID: 
\"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.244580 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3101bbf9-84d2-42a0-a530-516f06015a0c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.244608 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245127 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245146 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245377 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245550 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9281391-bb2a-40e2-ba91-bb6892bd888f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245627 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.245764 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.273205 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnblf\" (UniqueName: \"kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf\") pod \"route-controller-manager-6576b87f9c-9m5db\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.292511 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfs2w\" (UniqueName: \"kubernetes.io/projected/c7522dc2-2021-4ca3-8ece-f051f1149e61-kube-api-access-gfs2w\") pod \"apiserver-76f77b778f-7h6ff\" (UID: \"c7522dc2-2021-4ca3-8ece-f051f1149e61\") " pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.312152 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjks\" (UniqueName: \"kubernetes.io/projected/6e762bee-f41c-4af8-96ac-543b92f1f983-kube-api-access-9fjks\") pod \"authentication-operator-69f744f599-pn4qw\" (UID: \"6e762bee-f41c-4af8-96ac-543b92f1f983\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.315019 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.332064 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338002 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.338130 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.838085867 +0000 UTC m=+119.506769334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338218 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cbbd545d-5fde-4209-84e8-9252737745c4-tmpfs\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338244 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338260 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-webhook-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338285 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j4cr\" (UniqueName: 
\"kubernetes.io/projected/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-kube-api-access-7j4cr\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338301 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-config\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338316 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-profile-collector-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338331 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhkv\" (UniqueName: \"kubernetes.io/projected/9c24edcb-aeef-44a1-99b6-9e7904c41253-kube-api-access-xqhkv\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338348 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.338365 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-csi-data-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338379 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-registration-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338393 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph8ck\" (UniqueName: \"kubernetes.io/projected/752260dc-3c62-401d-90a9-3eb60ad4b8fa-kube-api-access-ph8ck\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338407 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdnr4\" (UniqueName: \"kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338422 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-service-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338455 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-srv-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338472 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ff35e-de43-4a64-bae9-f57f78f8d389-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338489 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9cz2\" (UniqueName: \"kubernetes.io/projected/543641a4-6d2d-437f-93a5-478579e0622f-kube-api-access-d9cz2\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338500 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-csi-data-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339059 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-config\") pod \"etcd-operator-b45778765-x8w2f\" (UID: 
\"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.338507 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339190 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsnl6\" (UniqueName: \"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-kube-api-access-qsnl6\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339216 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-socket-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339233 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-config\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339265 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339281 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c24edcb-aeef-44a1-99b6-9e7904c41253-service-ca-bundle\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339296 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-metrics-certs\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339332 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhpc\" (UniqueName: \"kubernetes.io/projected/579c1ccd-dc05-4543-9f3e-9e82915896dc-kube-api-access-bnhpc\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339356 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9343e0fc-ae90-490e-9f5a-eb1668c75226-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339544 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cbbd545d-5fde-4209-84e8-9252737745c4-tmpfs\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339635 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-socket-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339696 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-service-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339740 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ff35e-de43-4a64-bae9-f57f78f8d389-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.339281 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-registration-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340182 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c24edcb-aeef-44a1-99b6-9e7904c41253-service-ca-bundle\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340219 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2rx\" (UniqueName: \"kubernetes.io/projected/9343e0fc-ae90-490e-9f5a-eb1668c75226-kube-api-access-hc2rx\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340227 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/543641a4-6d2d-437f-93a5-478579e0622f-etcd-ca\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340240 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8jcd\" (UniqueName: \"kubernetes.io/projected/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-kube-api-access-w8jcd\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340253 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-config\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340270 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340290 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee769697-49da-45f8-824a-583871b70123-cert\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340528 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9343e0fc-ae90-490e-9f5a-eb1668c75226-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340656 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85df2\" (UniqueName: \"kubernetes.io/projected/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-kube-api-access-85df2\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340556 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-profile-collector-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340676 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvfk\" (UniqueName: \"kubernetes.io/projected/a57e8066-1fe2-4664-9415-fa8a8b6621c5-kube-api-access-mtvfk\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340719 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-images\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340743 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340760 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.340778 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340796 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/006ff35e-de43-4a64-bae9-f57f78f8d389-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340800 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340816 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-plugins-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340759 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: 
\"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340833 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340852 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57e8066-1fe2-4664-9415-fa8a8b6621c5-metrics-tls\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340880 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-node-bootstrap-token\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340903 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk99d\" (UniqueName: \"kubernetes.io/projected/593183b0-1bb3-42d1-8949-d1b56f0ac114-kube-api-access-kk99d\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340921 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a57e8066-1fe2-4664-9415-fa8a8b6621c5-config-volume\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340953 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lpq\" (UniqueName: \"kubernetes.io/projected/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-kube-api-access-66lpq\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340975 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.340992 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ff35e-de43-4a64-bae9-f57f78f8d389-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341007 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-serving-cert\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341022 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341036 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/579c1ccd-dc05-4543-9f3e-9e82915896dc-metrics-tls\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341052 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-stats-auth\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341066 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341091 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-key\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 
09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341106 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341126 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9c9g\" (UniqueName: \"kubernetes.io/projected/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-kube-api-access-m9c9g\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341140 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g84sk\" (UniqueName: \"kubernetes.io/projected/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-kube-api-access-g84sk\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341183 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwhx\" (UniqueName: \"kubernetes.io/projected/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-kube-api-access-xqwhx\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341244 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjqxb\" (UniqueName: 
\"kubernetes.io/projected/cbbd545d-5fde-4209-84e8-9252737745c4-kube-api-access-gjqxb\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341260 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-default-certificate\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341275 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xngp\" (UniqueName: \"kubernetes.io/projected/4b11032c-d124-4cd4-a810-90b94f3755cc-kube-api-access-4xngp\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341303 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-srv-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341322 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc 
kubenswrapper[4783]: I0131 09:06:48.341342 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341357 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9343e0fc-ae90-490e-9f5a-eb1668c75226-proxy-tls\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341373 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8ns\" (UniqueName: \"kubernetes.io/projected/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-kube-api-access-5m8ns\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341386 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341402 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94e21399-452d-4820-8683-6536189c56c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: 
\"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341445 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341497 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-images\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341566 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-plugins-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341804 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-webhook-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.341953 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342092 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94e21399-452d-4820-8683-6536189c56c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342116 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-serving-cert\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342141 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-certs\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342172 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-config\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342190 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4std2\" (UniqueName: \"kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342205 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjn74\" (UniqueName: \"kubernetes.io/projected/4c28e17f-e541-4f08-88cf-0bd130b756cc-kube-api-access-sjn74\") pod \"migrator-59844c95c7-lljb6\" (UID: \"4c28e17f-e541-4f08-88cf-0bd130b756cc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342224 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-proxy-tls\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342244 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42nt\" (UniqueName: \"kubernetes.io/projected/ee769697-49da-45f8-824a-583871b70123-kube-api-access-k42nt\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342270 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342308 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-mountpoint-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342323 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-cabundle\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342348 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-etcd-client\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342363 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbbd545d-5fde-4209-84e8-9252737745c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.342742 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.343106 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/593183b0-1bb3-42d1-8949-d1b56f0ac114-srv-cert\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.343414 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-config\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.343553 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-default-certificate\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.343747 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.843730734 +0000 UTC m=+119.512414201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.343804 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-cabundle\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.344350 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.345288 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a57e8066-1fe2-4664-9415-fa8a8b6621c5-config-volume\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.345293 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: 
\"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.345818 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.345996 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a57e8066-1fe2-4664-9415-fa8a8b6621c5-metrics-tls\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.346259 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-mountpoint-dir\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.346658 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94e21399-452d-4820-8683-6536189c56c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.346889 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-metrics-certs\") pod \"router-default-5444994796-wb9bj\" (UID: 
\"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.347280 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-serving-cert\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.347419 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-srv-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.348076 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/752260dc-3c62-401d-90a9-3eb60ad4b8fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.348477 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-serving-cert\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.348524 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9343e0fc-ae90-490e-9f5a-eb1668c75226-proxy-tls\") pod 
\"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.348965 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-node-bootstrap-token\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349010 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4b11032c-d124-4cd4-a810-90b94f3755cc-signing-key\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349025 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-certs\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349132 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9c24edcb-aeef-44a1-99b6-9e7904c41253-stats-auth\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349279 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ff35e-de43-4a64-bae9-f57f78f8d389-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349416 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/579c1ccd-dc05-4543-9f3e-9e82915896dc-metrics-tls\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.349587 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-proxy-tls\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.350101 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.350422 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 
09:06:48.350463 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/543641a4-6d2d-437f-93a5-478579e0622f-etcd-client\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.351100 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee769697-49da-45f8-824a-583871b70123-cert\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.351110 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94e21399-452d-4820-8683-6536189c56c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.351255 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.352927 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 
09:06:48.371145 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.374377 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9281391-bb2a-40e2-ba91-bb6892bd888f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gxn2x\" (UID: \"f9281391-bb2a-40e2-ba91-bb6892bd888f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: W0131 09:06:48.379459 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b0dabea_ff61_4a2e_9f7b_70e1d3e2c266.slice/crio-1caf0a8be7b8acab161bc4bf4ccbafaf43204c73585d52a9f1f1a9487ee8dd8b WatchSource:0}: Error finding container 1caf0a8be7b8acab161bc4bf4ccbafaf43204c73585d52a9f1f1a9487ee8dd8b: Status 404 returned error can't find the container with id 1caf0a8be7b8acab161bc4bf4ccbafaf43204c73585d52a9f1f1a9487ee8dd8b Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.393333 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj98b\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.395340 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.405210 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9chh8"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.405814 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6lht4"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.413212 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfffq\" (UniqueName: \"kubernetes.io/projected/a9087aa4-944f-458e-9dfc-0d2e1ee1246e-kube-api-access-zfffq\") pod \"openshift-config-operator-7777fb866f-wqsnf\" (UID: \"a9087aa4-944f-458e-9dfc-0d2e1ee1246e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.436632 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6j67\" (UniqueName: \"kubernetes.io/projected/0e337b91-adb8-4cb7-8e5e-be2b80e78f56-kube-api-access-l6j67\") pod \"machine-api-operator-5694c8668f-m54sp\" (UID: \"0e337b91-adb8-4cb7-8e5e-be2b80e78f56\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.441237 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pn4qw"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.446942 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.447400 4783 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:48.947381265 +0000 UTC m=+119.616064733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.450449 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.455023 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.455024 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdf2\" (UniqueName: \"kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2\") pod \"oauth-openshift-558db77b4-7csp2\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.476575 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dzb\" (UniqueName: \"kubernetes.io/projected/3101bbf9-84d2-42a0-a530-516f06015a0c-kube-api-access-c4dzb\") pod \"cluster-image-registry-operator-dc59b4c8b-wxdg7\" (UID: \"3101bbf9-84d2-42a0-a530-516f06015a0c\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.493524 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6n4\" (UniqueName: \"kubernetes.io/projected/92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1-kube-api-access-9x6n4\") pod \"openshift-controller-manager-operator-756b6f6bc6-7kn7h\" (UID: \"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.513700 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkc67\" (UniqueName: \"kubernetes.io/projected/ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157-kube-api-access-qkc67\") pod \"machine-approver-56656f9798-mlb7b\" (UID: \"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.539268 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.543531 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.549948 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.550479 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.550858 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.050842539 +0000 UTC m=+119.719526007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: W0131 09:06:48.551283 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9281391_bb2a_40e2_ba91_bb6892bd888f.slice/crio-0f5169a03ee60ca99b104f94eaf01a9bf9f7cd4f3baeae1a483f73211278411c WatchSource:0}: Error finding container 0f5169a03ee60ca99b104f94eaf01a9bf9f7cd4f3baeae1a483f73211278411c: Status 404 returned error can't find the container with id 0f5169a03ee60ca99b104f94eaf01a9bf9f7cd4f3baeae1a483f73211278411c Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.556026 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7j4cr\" (UniqueName: \"kubernetes.io/projected/934f89dc-ec50-4944-8f3d-e2a06cc98ebb-kube-api-access-7j4cr\") pod \"kube-storage-version-migrator-operator-b67b599dd-xt7sw\" (UID: \"934f89dc-ec50-4944-8f3d-e2a06cc98ebb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.573425 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhkv\" (UniqueName: \"kubernetes.io/projected/9c24edcb-aeef-44a1-99b6-9e7904c41253-kube-api-access-xqhkv\") pod \"router-default-5444994796-wb9bj\" (UID: \"9c24edcb-aeef-44a1-99b6-9e7904c41253\") " pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.580058 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.594529 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph8ck\" (UniqueName: \"kubernetes.io/projected/752260dc-3c62-401d-90a9-3eb60ad4b8fa-kube-api-access-ph8ck\") pod \"catalog-operator-68c6474976-774bh\" (UID: \"752260dc-3c62-401d-90a9-3eb60ad4b8fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.608870 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.613905 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdnr4\" (UniqueName: \"kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4\") pod \"collect-profiles-29497500-ljz8g\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.626105 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.636411 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9cz2\" (UniqueName: \"kubernetes.io/projected/543641a4-6d2d-437f-93a5-478579e0622f-kube-api-access-d9cz2\") pod \"etcd-operator-b45778765-x8w2f\" (UID: \"543641a4-6d2d-437f-93a5-478579e0622f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.651944 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.652322 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.152303866 +0000 UTC m=+119.820987333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.662527 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsnl6\" (UniqueName: \"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-kube-api-access-qsnl6\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.662743 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.668732 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.674154 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhpc\" (UniqueName: \"kubernetes.io/projected/579c1ccd-dc05-4543-9f3e-9e82915896dc-kube-api-access-bnhpc\") pod \"dns-operator-744455d44c-mv84q\" (UID: \"579c1ccd-dc05-4543-9f3e-9e82915896dc\") " pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.691824 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.697757 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2rx\" (UniqueName: \"kubernetes.io/projected/9343e0fc-ae90-490e-9f5a-eb1668c75226-kube-api-access-hc2rx\") pod \"machine-config-controller-84d6567774-6pvst\" (UID: \"9343e0fc-ae90-490e-9f5a-eb1668c75226\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.708693 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7h6ff"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.718700 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8jcd\" (UniqueName: \"kubernetes.io/projected/4ded8f26-8d73-4a0e-a976-b3b4169a0c04-kube-api-access-w8jcd\") pod \"service-ca-operator-777779d784-w5fjp\" (UID: \"4ded8f26-8d73-4a0e-a976-b3b4169a0c04\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.721133 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.732401 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.739116 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.741151 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85df2\" (UniqueName: \"kubernetes.io/projected/a3f564d8-5f07-446d-9dd1-955e39d4a5f4-kube-api-access-85df2\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4qqm\" (UID: \"a3f564d8-5f07-446d-9dd1-955e39d4a5f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.745504 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.752829 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.753111 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.253098109 +0000 UTC m=+119.921781577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.756662 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.760091 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtvfk\" (UniqueName: \"kubernetes.io/projected/a57e8066-1fe2-4664-9415-fa8a8b6621c5-kube-api-access-mtvfk\") pod \"dns-default-sm9jn\" (UID: \"a57e8066-1fe2-4664-9415-fa8a8b6621c5\") " pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.771956 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.795319 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.829327 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.832984 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.840825 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e2e303d-86eb-4b59-bfde-b8bfccb3ae65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnzfn\" (UID: \"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.842063 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk99d\" (UniqueName: \"kubernetes.io/projected/593183b0-1bb3-42d1-8949-d1b56f0ac114-kube-api-access-kk99d\") pod \"olm-operator-6b444d44fb-55h6m\" (UID: \"593183b0-1bb3-42d1-8949-d1b56f0ac114\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.848113 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/006ff35e-de43-4a64-bae9-f57f78f8d389-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k7m9b\" (UID: \"006ff35e-de43-4a64-bae9-f57f78f8d389\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.853244 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:48 crc kubenswrapper[4783]: E0131 09:06:48.853699 4783 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.353681866 +0000 UTC m=+120.022365334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.862647 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9c9g\" (UniqueName: \"kubernetes.io/projected/40a6c5ba-0bc2-4e23-b92b-8486e77001ae-kube-api-access-m9c9g\") pod \"multus-admission-controller-857f4d67dd-n9kcz\" (UID: \"40a6c5ba-0bc2-4e23-b92b-8486e77001ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.874840 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g84sk\" (UniqueName: \"kubernetes.io/projected/44c20261-8b8d-4fe3-9ae7-a07a46eafac8-kube-api-access-g84sk\") pod \"package-server-manager-789f6589d5-hcbdl\" (UID: \"44c20261-8b8d-4fe3-9ae7-a07a46eafac8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.881639 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwhx\" (UniqueName: \"kubernetes.io/projected/06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f-kube-api-access-xqwhx\") pod \"machine-config-operator-74547568cd-gf2tf\" (UID: \"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.906038 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjqxb\" (UniqueName: \"kubernetes.io/projected/cbbd545d-5fde-4209-84e8-9252737745c4-kube-api-access-gjqxb\") pod \"packageserver-d55dfcdfc-g68tl\" (UID: \"cbbd545d-5fde-4209-84e8-9252737745c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.921750 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42nt\" (UniqueName: \"kubernetes.io/projected/ee769697-49da-45f8-824a-583871b70123-kube-api-access-k42nt\") pod \"ingress-canary-88hqp\" (UID: \"ee769697-49da-45f8-824a-583871b70123\") " pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.930669 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m54sp"] Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.947916 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjn74\" (UniqueName: \"kubernetes.io/projected/4c28e17f-e541-4f08-88cf-0bd130b756cc-kube-api-access-sjn74\") pod \"migrator-59844c95c7-lljb6\" (UID: \"4c28e17f-e541-4f08-88cf-0bd130b756cc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.954782 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:48 crc kubenswrapper[4783]: 
E0131 09:06:48.955073 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.455061587 +0000 UTC m=+120.123745056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.959155 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4std2\" (UniqueName: \"kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2\") pod \"marketplace-operator-79b997595-lkj9z\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.976617 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lpq\" (UniqueName: \"kubernetes.io/projected/ed58079c-95e2-4931-a94e-ec9e1a65bbc0-kube-api-access-66lpq\") pod \"machine-config-server-5kcff\" (UID: \"ed58079c-95e2-4931-a94e-ec9e1a65bbc0\") " pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:48 crc kubenswrapper[4783]: I0131 09:06:48.986989 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:48.999803 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.008563 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xngp\" (UniqueName: \"kubernetes.io/projected/4b11032c-d124-4cd4-a810-90b94f3755cc-kube-api-access-4xngp\") pod \"service-ca-9c57cc56f-ljfs4\" (UID: \"4b11032c-d124-4cd4-a810-90b94f3755cc\") " pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.016823 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94e21399-452d-4820-8683-6536189c56c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4n7fz\" (UID: \"94e21399-452d-4820-8683-6536189c56c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.025980 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.045388 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8ns\" (UniqueName: \"kubernetes.io/projected/c2ff120c-221e-4dcb-a51f-9c53c3fe25c7-kube-api-access-5m8ns\") pod \"csi-hostpathplugin-brrd5\" (UID: \"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7\") " pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.049859 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" event={"ID":"c7522dc2-2021-4ca3-8ece-f051f1149e61","Type":"ContainerStarted","Data":"3b65fd0453f1fbf8c1fd548ef4b92d76ddac8301f382e9fe2f5202506ce14aeb"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.050636 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" event={"ID":"0e337b91-adb8-4cb7-8e5e-be2b80e78f56","Type":"ContainerStarted","Data":"2bc8435cbb6557b3126d0bdd93b9ec283b8783048912959574c34213cc5e665c"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.051672 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6lht4" event={"ID":"700365a9-e7e5-413b-a980-42b9abcd61c7","Type":"ContainerStarted","Data":"7baf44e70ac8bd1faa5cfc0fadb657429009ba892733542a112fb65c00ea6920"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.051694 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6lht4" event={"ID":"700365a9-e7e5-413b-a980-42b9abcd61c7","Type":"ContainerStarted","Data":"151e8647ae442c9a12f99dae72e07eebc481a26cc229e44485d034f859170ff8"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.053386 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.053796 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.055383 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.055982 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.555968446 +0000 UTC m=+120.224651913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.057844 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" event={"ID":"175323b1-b0a0-4811-a2c7-4c98ee3a5b56","Type":"ContainerStarted","Data":"dae54553c9ef94b565eb5a57a9d5d22ef453c5b971e381a70d401ebfcb44b90d"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.070237 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.070237 4783 patch_prober.go:28] interesting pod/console-operator-58897d9998-6lht4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.070576 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6lht4" podUID="700365a9-e7e5-413b-a980-42b9abcd61c7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.074818 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" event={"ID":"f9281391-bb2a-40e2-ba91-bb6892bd888f","Type":"ContainerStarted","Data":"940f237402be717ea9b7f0ccd797c095e17218636edaebee301559311ba20c97"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.074845 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" event={"ID":"f9281391-bb2a-40e2-ba91-bb6892bd888f","Type":"ContainerStarted","Data":"0f5169a03ee60ca99b104f94eaf01a9bf9f7cd4f3baeae1a483f73211278411c"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.076642 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.079503 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" event={"ID":"6e762bee-f41c-4af8-96ac-543b92f1f983","Type":"ContainerStarted","Data":"f5a9ddbd61cd9da5ad9f140b8a5adcdd21f6510b85218f0db83b434e771cd05d"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.079528 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" event={"ID":"6e762bee-f41c-4af8-96ac-543b92f1f983","Type":"ContainerStarted","Data":"ecfe8968f9466a88804aa316f132e4b1ce2f4f6f35976b398a4012a6237886d4"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.096001 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" event={"ID":"6e091acc-49d9-4782-b82a-d71e6f276dce","Type":"ContainerStarted","Data":"6977ee5119abed99ac031d2866c1fb87fcff6f3ffb39fe254ea72eccd568434b"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.098574 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" event={"ID":"6e091acc-49d9-4782-b82a-d71e6f276dce","Type":"ContainerStarted","Data":"adf1dd3c868624594dd1322ea4d7ed561121d056509a9345c2538703608f1e88"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.098827 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.099632 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.103431 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.108362 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.121413 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" event={"ID":"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d","Type":"ContainerStarted","Data":"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.121450 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" event={"ID":"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d","Type":"ContainerStarted","Data":"7338079f256bc10b53448708c2288e787324889d66922e9afc308f56c3ac2406"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.121612 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.130690 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.133839 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjwbp" event={"ID":"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266","Type":"ContainerStarted","Data":"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 
09:06:49.133891 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjwbp" event={"ID":"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266","Type":"ContainerStarted","Data":"1caf0a8be7b8acab161bc4bf4ccbafaf43204c73585d52a9f1f1a9487ee8dd8b"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.137582 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.141953 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.142130 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9chh8" event={"ID":"62344451-2a07-4504-833b-de06393277f2","Type":"ContainerStarted","Data":"d7959bb1f50d420be990e48685c4cc65f3fa0a08e3b8c2b64867ae065d40530f"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.142187 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9chh8" event={"ID":"62344451-2a07-4504-833b-de06393277f2","Type":"ContainerStarted","Data":"460340c4d448c476c4abe773a6384c753475656a40a5d1cdb9a2e2a38c638af8"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.142363 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.143961 4783 generic.go:334] "Generic (PLEG): container finished" podID="0aa3bf42-302f-4ad1-9a65-d2c878e957a6" containerID="97dfc9b6802456cbb12eeb6fb23927ca1f7ceb8b17b831afc03be98bf0a0f6fa" exitCode=0 Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.144005 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" 
event={"ID":"0aa3bf42-302f-4ad1-9a65-d2c878e957a6","Type":"ContainerDied","Data":"97dfc9b6802456cbb12eeb6fb23927ca1f7ceb8b17b831afc03be98bf0a0f6fa"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.144025 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" event={"ID":"0aa3bf42-302f-4ad1-9a65-d2c878e957a6","Type":"ContainerStarted","Data":"a4b54b4028220675bc47c6c4936867055a2aa5af0d06c2d5a03734440eb4223b"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.150146 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.154604 4783 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chh8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.154663 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chh8" podUID="62344451-2a07-4504-833b-de06393277f2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.154919 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-88hqp" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.156798 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" event={"ID":"63366f9c-1f0c-4f9c-ae86-3298cb4274f9","Type":"ContainerStarted","Data":"f43f164842d46422c24fceb3bf6db28359864523bdda4370f598d7dee4197231"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.156828 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" event={"ID":"63366f9c-1f0c-4f9c-ae86-3298cb4274f9","Type":"ContainerStarted","Data":"90a24f4fdb88101bbcb403fb8881fc7d6ea3b92db91c8f9bef3035f7be10c95e"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.156839 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" event={"ID":"63366f9c-1f0c-4f9c-ae86-3298cb4274f9","Type":"ContainerStarted","Data":"5f97d12e8b87b5389d561b0c39c52862a73843ea3a38d15a9cb15614f70cf727"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.157093 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.158229 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.658220184 +0000 UTC m=+120.326903652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.159331 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5kcff" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.162931 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" event={"ID":"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157","Type":"ContainerStarted","Data":"fda9bf0a9c42e995e9df8c0d575284ba6ef783f5e51cfa6f91ae1311c999f641"} Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.178079 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.262315 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.262405 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.762384276 +0000 UTC m=+120.431067744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.263025 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.267765 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.767751399 +0000 UTC m=+120.436434867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.311768 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.313703 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.313815 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mv84q"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.365257 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.366945 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.866897889 +0000 UTC m=+120.535581358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.468604 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.468841 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:49.968827009 +0000 UTC m=+120.637510477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.537729 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.538595 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.570803 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.572684 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.07265664 +0000 UTC m=+120.741340107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.676668 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.677285 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.177271404 +0000 UTC m=+120.845954872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: W0131 09:06:49.758855 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934f89dc_ec50_4944_8f3d_e2a06cc98ebb.slice/crio-27474720a7df34b7b9ae9a1664b584f4b5e51c4eb0f1b23186cb26cd23471745 WatchSource:0}: Error finding container 27474720a7df34b7b9ae9a1664b584f4b5e51c4eb0f1b23186cb26cd23471745: Status 404 returned error can't find the container with id 27474720a7df34b7b9ae9a1664b584f4b5e51c4eb0f1b23186cb26cd23471745 Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.774298 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.774328 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.774345 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sm9jn"] Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.778777 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc 
kubenswrapper[4783]: E0131 09:06:49.779127 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.279107216 +0000 UTC m=+120.947790685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.858033 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fl86q" podStartSLOduration=97.858012238 podStartE2EDuration="1m37.858012238s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:49.816354911 +0000 UTC m=+120.485038379" watchObservedRunningTime="2026-01-31 09:06:49.858012238 +0000 UTC m=+120.526695707" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.879804 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.880125 4783 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.380113352 +0000 UTC m=+121.048796820 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.970456 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gxn2x" podStartSLOduration=97.970439686 podStartE2EDuration="1m37.970439686s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:49.930492249 +0000 UTC m=+120.599175718" watchObservedRunningTime="2026-01-31 09:06:49.970439686 +0000 UTC m=+120.639123154" Jan 31 09:06:49 crc kubenswrapper[4783]: I0131 09:06:49.981459 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:49 crc kubenswrapper[4783]: E0131 09:06:49.982080 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:50.482044716 +0000 UTC m=+121.150728183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.012154 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xjwbp" podStartSLOduration=98.012136798 podStartE2EDuration="1m38.012136798s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.011476069 +0000 UTC m=+120.680159538" watchObservedRunningTime="2026-01-31 09:06:50.012136798 +0000 UTC m=+120.680820267" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.084004 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.084346 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.584335658 +0000 UTC m=+121.253019127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.123070 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x8w2f"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.126230 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.186600 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.186834 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.686789369 +0000 UTC m=+121.355472827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.240317 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9chh8" podStartSLOduration=98.240299534 podStartE2EDuration="1m38.240299534s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.238565747 +0000 UTC m=+120.907249216" watchObservedRunningTime="2026-01-31 09:06:50.240299534 +0000 UTC m=+120.908983002" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.276207 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.296152 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.296610 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:50.796593619 +0000 UTC m=+121.465277086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.315930 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.318256 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pn4qw" podStartSLOduration=98.318233371 podStartE2EDuration="1m38.318233371s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.271422415 +0000 UTC m=+120.940105882" watchObservedRunningTime="2026-01-31 09:06:50.318233371 +0000 UTC m=+120.986916839" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.341565 4783 generic.go:334] "Generic (PLEG): container finished" podID="c7522dc2-2021-4ca3-8ece-f051f1149e61" containerID="75c4541f94ee34681adb1fa211f1fb4fcda6dfc44c59faec4abbe8b2cf7e3e55" exitCode=0 Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.341657 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" event={"ID":"c7522dc2-2021-4ca3-8ece-f051f1149e61","Type":"ContainerDied","Data":"75c4541f94ee34681adb1fa211f1fb4fcda6dfc44c59faec4abbe8b2cf7e3e55"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 
09:06:50.388472 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" event={"ID":"9343e0fc-ae90-490e-9f5a-eb1668c75226","Type":"ContainerStarted","Data":"0c05bc36a3dfbced55376e81af966f4c42ea4667757567746ec6f90e4d7ce56f"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.400986 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.401362 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:50.90134462 +0000 UTC m=+121.570028089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.421599 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" event={"ID":"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157","Type":"ContainerStarted","Data":"5f551b9e48165406ff84d71317616110864c77867ae7558838ec9255ec77e5e8"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.422731 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" event={"ID":"f100d6ab-c3b2-4712-b2d3-370287baadb4","Type":"ContainerStarted","Data":"611c376cdf6854afd499618d547c89b888bf4c401bd54406ea49d33cfaf2857b"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.424407 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm9jn" event={"ID":"a57e8066-1fe2-4664-9415-fa8a8b6621c5","Type":"ContainerStarted","Data":"1623d399f9135b99e358c410f316fc55512b23b2c28d0debf7d549f93cdc727f"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.425325 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" event={"ID":"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1","Type":"ContainerStarted","Data":"59408c3e1d65eebe3bd242d5a7249b5cd3827cd539ea328ee85a6c5ef7d6ae14"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.427840 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" event={"ID":"3101bbf9-84d2-42a0-a530-516f06015a0c","Type":"ContainerStarted","Data":"50708124ee95bc01a371802f51c814ddb0296b90574e8ec99d4e5a2d949fbd45"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.430446 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" event={"ID":"175323b1-b0a0-4811-a2c7-4c98ee3a5b56","Type":"ContainerStarted","Data":"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.430802 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.449578 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5kcff" event={"ID":"ed58079c-95e2-4931-a94e-ec9e1a65bbc0","Type":"ContainerStarted","Data":"75d54d4fa1c050ce6b23a8d1699863dbe05f40d986cda868893e4c0bb282a11a"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.462001 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" event={"ID":"0e337b91-adb8-4cb7-8e5e-be2b80e78f56","Type":"ContainerStarted","Data":"e2300e8ad7486bc390042936e28b0be4136df66dfd49555a511d6016e57a74a3"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.473047 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" event={"ID":"579c1ccd-dc05-4543-9f3e-9e82915896dc","Type":"ContainerStarted","Data":"62a7a37787ccd918e3604d0ec18aa103d22c6848e0a85fb8a6475b3b2be95fe9"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.478335 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" 
event={"ID":"a9087aa4-944f-458e-9dfc-0d2e1ee1246e","Type":"ContainerStarted","Data":"cb89ba01125a25521b1f42f4be14c635879aa4098f1ac0984480dd5a539d884e"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.479320 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" event={"ID":"934f89dc-ec50-4944-8f3d-e2a06cc98ebb","Type":"ContainerStarted","Data":"27474720a7df34b7b9ae9a1664b584f4b5e51c4eb0f1b23186cb26cd23471745"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.482902 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wb9bj" event={"ID":"9c24edcb-aeef-44a1-99b6-9e7904c41253","Type":"ContainerStarted","Data":"57ba27eaeab8d781a04c8415ee7c64dcded4988ade65b7b40a654282c8d0ad2d"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.483349 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wb9bj" event={"ID":"9c24edcb-aeef-44a1-99b6-9e7904c41253","Type":"ContainerStarted","Data":"70bbc633b354cbd51ffe4e9c8045bf87909de123286be28f73b374e762dddae8"} Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.485305 4783 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chh8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.485363 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chh8" podUID="62344451-2a07-4504-833b-de06393277f2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.497928 4783 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6nkc" podStartSLOduration=98.497910976 podStartE2EDuration="1m38.497910976s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.497361057 +0000 UTC m=+121.166044525" watchObservedRunningTime="2026-01-31 09:06:50.497910976 +0000 UTC m=+121.166594444" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.501851 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.503483 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.003467967 +0000 UTC m=+121.672151435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.573829 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" podStartSLOduration=98.573811191 podStartE2EDuration="1m38.573811191s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.572512556 +0000 UTC m=+121.241196024" watchObservedRunningTime="2026-01-31 09:06:50.573811191 +0000 UTC m=+121.242494659" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.602404 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.603747 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.103727571 +0000 UTC m=+121.772411040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.687536 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6lht4" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.704136 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.705029 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.205013175 +0000 UTC m=+121.873696643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.705030 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.792891 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.806673 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.806957 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.306942004 +0000 UTC m=+121.975625472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.809721 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.828547 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf"] Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.833388 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.843124 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:50 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:50 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:50 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.843244 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.860896 4783 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6lht4" podStartSLOduration=98.86088173 podStartE2EDuration="1m38.86088173s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.859131593 +0000 UTC m=+121.527815062" watchObservedRunningTime="2026-01-31 09:06:50.86088173 +0000 UTC m=+121.529565188" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.862070 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm"] Jan 31 09:06:50 crc kubenswrapper[4783]: W0131 09:06:50.878504 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod006ff35e_de43_4a64_bae9_f57f78f8d389.slice/crio-4daea23d00fe74fe5c4daad79adf38c0568f8a15502610af43a21e539ddbe27c WatchSource:0}: Error finding container 4daea23d00fe74fe5c4daad79adf38c0568f8a15502610af43a21e539ddbe27c: Status 404 returned error can't find the container with id 4daea23d00fe74fe5c4daad79adf38c0568f8a15502610af43a21e539ddbe27c Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.892097 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wb9bj" podStartSLOduration=98.892077018 podStartE2EDuration="1m38.892077018s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:50.891841904 +0000 UTC m=+121.560525372" watchObservedRunningTime="2026-01-31 09:06:50.892077018 +0000 UTC m=+121.560760486" Jan 31 09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.896149 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m"] Jan 31 
09:06:50 crc kubenswrapper[4783]: I0131 09:06:50.908423 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:50 crc kubenswrapper[4783]: E0131 09:06:50.908848 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.408835055 +0000 UTC m=+122.077518522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.015582 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.016181 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:51.516151634 +0000 UTC m=+122.184835102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.018835 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.025503 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n9kcz"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.025584 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-88hqp"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.038532 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-brrd5"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.041212 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl"] Jan 31 09:06:51 crc kubenswrapper[4783]: W0131 09:06:51.057201 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbbd545d_5fde_4209_84e8_9252737745c4.slice/crio-c6c3061a0630cabae75edb0010ff25e344107361e499806ec577149c69204157 WatchSource:0}: Error finding container c6c3061a0630cabae75edb0010ff25e344107361e499806ec577149c69204157: Status 404 returned error can't find the container with id 
c6c3061a0630cabae75edb0010ff25e344107361e499806ec577149c69204157 Jan 31 09:06:51 crc kubenswrapper[4783]: W0131 09:06:51.077176 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a6c5ba_0bc2_4e23_b92b_8486e77001ae.slice/crio-5a26e596f5208034fac0e712d3606c123f7e19796adf6f7086311c4eb94ddabb WatchSource:0}: Error finding container 5a26e596f5208034fac0e712d3606c123f7e19796adf6f7086311c4eb94ddabb: Status 404 returned error can't find the container with id 5a26e596f5208034fac0e712d3606c123f7e19796adf6f7086311c4eb94ddabb Jan 31 09:06:51 crc kubenswrapper[4783]: W0131 09:06:51.085261 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee769697_49da_45f8_824a_583871b70123.slice/crio-0b57f5e9aa25d6994bdd5d1a6abde3dee753e77ca132d73e4274def98f42a207 WatchSource:0}: Error finding container 0b57f5e9aa25d6994bdd5d1a6abde3dee753e77ca132d73e4274def98f42a207: Status 404 returned error can't find the container with id 0b57f5e9aa25d6994bdd5d1a6abde3dee753e77ca132d73e4274def98f42a207 Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.091199 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ljfs4"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.104456 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.114270 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.118422 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.119459 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.619441935 +0000 UTC m=+122.288125403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.151624 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz"] Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.219978 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.220144 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:51.720120872 +0000 UTC m=+122.388804330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.220417 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.220799 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.720791119 +0000 UTC m=+122.389474587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: W0131 09:06:51.240743 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c20261_8b8d_4fe3_9ae7_a07a46eafac8.slice/crio-d69bdc4f3564d719e8fb8b590f6e8e5dce7f653b8d12530024baf8d89ab73dee WatchSource:0}: Error finding container d69bdc4f3564d719e8fb8b590f6e8e5dce7f653b8d12530024baf8d89ab73dee: Status 404 returned error can't find the container with id d69bdc4f3564d719e8fb8b590f6e8e5dce7f653b8d12530024baf8d89ab73dee Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.321492 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.321983 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.82196433 +0000 UTC m=+122.490647798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.351594 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" podStartSLOduration=98.351579341 podStartE2EDuration="1m38.351579341s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.298107739 +0000 UTC m=+121.966791207" watchObservedRunningTime="2026-01-31 09:06:51.351579341 +0000 UTC m=+122.020262809" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.423321 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.431239 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:51.931216848 +0000 UTC m=+122.599900316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.517674 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" event={"ID":"579c1ccd-dc05-4543-9f3e-9e82915896dc","Type":"ContainerStarted","Data":"2be620a32377f80650e2eff58333d15c00a99c1256d05a11dac5ee1340953b59"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.524438 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-88hqp" event={"ID":"ee769697-49da-45f8-824a-583871b70123","Type":"ContainerStarted","Data":"0b57f5e9aa25d6994bdd5d1a6abde3dee753e77ca132d73e4274def98f42a207"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.524673 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.525088 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.025069064 +0000 UTC m=+122.693752532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.526825 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" event={"ID":"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7","Type":"ContainerStarted","Data":"4a49022806ddb97caccd8355bbde5d4e0f5c68f484c5c265c69514cfdd58c124"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.533425 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" event={"ID":"593183b0-1bb3-42d1-8949-d1b56f0ac114","Type":"ContainerStarted","Data":"450a8784d3ff1aeb07756712a3e3b37b156be0290bb71c61d78a968183247a8f"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.534462 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.536767 4783 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-55h6m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.536896 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" podUID="593183b0-1bb3-42d1-8949-d1b56f0ac114" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.545745 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" event={"ID":"4ded8f26-8d73-4a0e-a976-b3b4169a0c04","Type":"ContainerStarted","Data":"5b63a3cf046ce2a97860bbd18e4dd6c4a1487dcbc2227aec92125d1ae290ad4f"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.545896 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" event={"ID":"4ded8f26-8d73-4a0e-a976-b3b4169a0c04","Type":"ContainerStarted","Data":"19b80e26a4f3054a17c59f7e5e17154dfe53995d02daa201505e764b7155380f"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.550080 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" event={"ID":"44c20261-8b8d-4fe3-9ae7-a07a46eafac8","Type":"ContainerStarted","Data":"d69bdc4f3564d719e8fb8b590f6e8e5dce7f653b8d12530024baf8d89ab73dee"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.555486 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" event={"ID":"0aa3bf42-302f-4ad1-9a65-d2c878e957a6","Type":"ContainerStarted","Data":"5917b5bf11a1939ef8105ad5d1630f3a2417671d272607c7f1e703e77cd6323c"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.561581 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" event={"ID":"a3f564d8-5f07-446d-9dd1-955e39d4a5f4","Type":"ContainerStarted","Data":"b83b5d727af480f543dfc08221865a7325ec221e4564b29de1118626582bc41b"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.561630 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" event={"ID":"a3f564d8-5f07-446d-9dd1-955e39d4a5f4","Type":"ContainerStarted","Data":"294e13dccd2f0af9f43d5a173916ddfa12d500539c747597bceaa4a0f2b3b428"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.562488 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" podStartSLOduration=98.562478285 podStartE2EDuration="1m38.562478285s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.559911293 +0000 UTC m=+122.228594761" watchObservedRunningTime="2026-01-31 09:06:51.562478285 +0000 UTC m=+122.231161752" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.567825 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" event={"ID":"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f","Type":"ContainerStarted","Data":"c68eed8f08f36d551081ef34d25d76da0f8243321d6b3bff5e85aa0fb000c524"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.567854 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" event={"ID":"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f","Type":"ContainerStarted","Data":"0c068d2003e03fb1495f20c718724e86f0a1d0f5ed5301831ea6a3e32b3f4794"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.577844 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" event={"ID":"94e21399-452d-4820-8683-6536189c56c0","Type":"ContainerStarted","Data":"d0dc34ebb2bd7571171c3f5d2ce888d6ef754768aa608e0a679e3740ac2883e6"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.579195 4783 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5fjp" podStartSLOduration=98.579184833 podStartE2EDuration="1m38.579184833s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.577098762 +0000 UTC m=+122.245782230" watchObservedRunningTime="2026-01-31 09:06:51.579184833 +0000 UTC m=+122.247868302" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.582028 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" event={"ID":"934f89dc-ec50-4944-8f3d-e2a06cc98ebb","Type":"ContainerStarted","Data":"d49be23adfb0153c361b0a92f67e09cbe4fbe431d8867296bd9a887e0633e84b"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.595526 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" podStartSLOduration=98.595507588 podStartE2EDuration="1m38.595507588s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.595107803 +0000 UTC m=+122.263791271" watchObservedRunningTime="2026-01-31 09:06:51.595507588 +0000 UTC m=+122.264191056" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.597423 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" event={"ID":"ed27e8cd-5c9d-43b3-8ea9-42e5c33d2157","Type":"ContainerStarted","Data":"cc437fe67addc6f8defce5ae9a063818f72d60b489fb187c4e28d2779e17d6f0"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.612077 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" 
event={"ID":"f100d6ab-c3b2-4712-b2d3-370287baadb4","Type":"ContainerStarted","Data":"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.612985 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.614784 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" event={"ID":"cbbd545d-5fde-4209-84e8-9252737745c4","Type":"ContainerStarted","Data":"c6c3061a0630cabae75edb0010ff25e344107361e499806ec577149c69204157"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.621079 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" event={"ID":"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65","Type":"ContainerStarted","Data":"77b51b931cf5d9f6a030917d1f9e27d149db4937bba14769b8b09a5768d64d4f"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.621320 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xt7sw" podStartSLOduration=99.621303973 podStartE2EDuration="1m39.621303973s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.6149469 +0000 UTC m=+122.283630369" watchObservedRunningTime="2026-01-31 09:06:51.621303973 +0000 UTC m=+122.289987442" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.627954 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.631268 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.131250018 +0000 UTC m=+122.799933486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.673129 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" event={"ID":"006ff35e-de43-4a64-bae9-f57f78f8d389","Type":"ContainerStarted","Data":"ce01e2dcec95a9f7dec968253239c75d1a875662f806478e6ccb1e058da74613"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.673202 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.673214 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" event={"ID":"006ff35e-de43-4a64-bae9-f57f78f8d389","Type":"ContainerStarted","Data":"4daea23d00fe74fe5c4daad79adf38c0568f8a15502610af43a21e539ddbe27c"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.681546 4783 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4qqm" podStartSLOduration=98.68153582 podStartE2EDuration="1m38.68153582s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.644931211 +0000 UTC m=+122.313614678" watchObservedRunningTime="2026-01-31 09:06:51.68153582 +0000 UTC m=+122.350219288" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.683719 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" podStartSLOduration=99.683714006 podStartE2EDuration="1m39.683714006s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.68097457 +0000 UTC m=+122.349658038" watchObservedRunningTime="2026-01-31 09:06:51.683714006 +0000 UTC m=+122.352397474" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.685820 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5kcff" event={"ID":"ed58079c-95e2-4931-a94e-ec9e1a65bbc0","Type":"ContainerStarted","Data":"d0b5aaf8750691d2a72282b5dc654940127bf6b2966ad5a273ca4844b5b72b35"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.699140 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" event={"ID":"543641a4-6d2d-437f-93a5-478579e0622f","Type":"ContainerStarted","Data":"a10270f273f8422c0a002246af78e6fcb4dae630174bd6e8117eb606b84c6f0a"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.699193 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" 
event={"ID":"543641a4-6d2d-437f-93a5-478579e0622f","Type":"ContainerStarted","Data":"e23ea41a02a3238b75fe13a388ba757fc33f3da57dc2f258aca55cc09f48efbb"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.703509 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mlb7b" podStartSLOduration=99.703502289 podStartE2EDuration="1m39.703502289s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.703038823 +0000 UTC m=+122.371722311" watchObservedRunningTime="2026-01-31 09:06:51.703502289 +0000 UTC m=+122.372185757" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.728679 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm9jn" event={"ID":"a57e8066-1fe2-4664-9415-fa8a8b6621c5","Type":"ContainerStarted","Data":"70212e842f08ec32d61d8d8082bbbe64e45584d7faf09c6e3610b4ac5f57bff3"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.728985 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sm9jn" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.729638 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.730320 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:52.230298814 +0000 UTC m=+122.898982283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.740275 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.741530 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.2415155 +0000 UTC m=+122.910198968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.760987 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" event={"ID":"7cf20dc6-2184-41b1-a943-f917dafb36b4","Type":"ContainerStarted","Data":"f82906df5be8f70c9f0319456b0a6ada2d3989c8fad437f7e9159f4ecedda1be"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.761026 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.763688 4783 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lkj9z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.763731 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.766528 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" podStartSLOduration=99.766518046 
podStartE2EDuration="1m39.766518046s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.736315343 +0000 UTC m=+122.404998812" watchObservedRunningTime="2026-01-31 09:06:51.766518046 +0000 UTC m=+122.435201514" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.767735 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5kcff" podStartSLOduration=6.767728854 podStartE2EDuration="6.767728854s" podCreationTimestamp="2026-01-31 09:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.766140814 +0000 UTC m=+122.434824281" watchObservedRunningTime="2026-01-31 09:06:51.767728854 +0000 UTC m=+122.436412321" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.783976 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" event={"ID":"c7522dc2-2021-4ca3-8ece-f051f1149e61","Type":"ContainerStarted","Data":"91b39e86907ccfbc3b89969445657604775786d69633df5f015b8e99e6301543"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.808849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" event={"ID":"4c28e17f-e541-4f08-88cf-0bd130b756cc","Type":"ContainerStarted","Data":"cd37ca233df1a3b3b41fc26b62ed943ec973fc62872b60d97bf05680d3f3853e"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.811309 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-x8w2f" podStartSLOduration=99.811299647 podStartE2EDuration="1m39.811299647s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.808930239 +0000 UTC m=+122.477613708" watchObservedRunningTime="2026-01-31 09:06:51.811299647 +0000 UTC m=+122.479983115" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.827947 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" event={"ID":"3101bbf9-84d2-42a0-a530-516f06015a0c","Type":"ContainerStarted","Data":"5f3b375d82b09adb14ea193e3529f3114c3536ef52554b92a79e38341adc1328"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.835206 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sm9jn" podStartSLOduration=6.835198376 podStartE2EDuration="6.835198376s" podCreationTimestamp="2026-01-31 09:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.83493532 +0000 UTC m=+122.503618778" watchObservedRunningTime="2026-01-31 09:06:51.835198376 +0000 UTC m=+122.503881845" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.844255 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:51 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:51 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:51 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.844285 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:51 crc kubenswrapper[4783]: 
I0131 09:06:51.844704 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.845506 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.345494513 +0000 UTC m=+123.014177981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.870698 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" event={"ID":"9343e0fc-ae90-490e-9f5a-eb1668c75226","Type":"ContainerStarted","Data":"15afccc474342c69081f89c293f22bcd2f143edb028beee5a31c48fd3d79e9cb"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.905237 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" event={"ID":"752260dc-3c62-401d-90a9-3eb60ad4b8fa","Type":"ContainerStarted","Data":"9a7bda2d230210663c179efac723e5f380de662dbb9bbd9c213f834131c03d01"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.905271 4783 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" event={"ID":"752260dc-3c62-401d-90a9-3eb60ad4b8fa","Type":"ContainerStarted","Data":"0982a67a0fe8756188ee4710488c9d88b304aa58e1788896d680bde1c75d6c6a"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.905926 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.906970 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" event={"ID":"f01828fd-fba3-487c-a2c6-f5599e1c379d","Type":"ContainerStarted","Data":"4ade95b367c1495ae5ec997fab2d073aca17ec4cedaa325800e7e230d740ec37"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.906993 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" event={"ID":"f01828fd-fba3-487c-a2c6-f5599e1c379d","Type":"ContainerStarted","Data":"d4a398d6bb0e2b270902fb8b1cce54ee0f559638412338b6be0ad72c54209476"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.923409 4783 csr.go:261] certificate signing request csr-xxwj7 is approved, waiting to be issued Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.940654 4783 csr.go:257] certificate signing request csr-xxwj7 is issued Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.941872 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k7m9b" podStartSLOduration=99.941860289 podStartE2EDuration="1m39.941860289s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.940875618 +0000 UTC m=+122.609559087" 
watchObservedRunningTime="2026-01-31 09:06:51.941860289 +0000 UTC m=+122.610543757" Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.950868 4783 generic.go:334] "Generic (PLEG): container finished" podID="a9087aa4-944f-458e-9dfc-0d2e1ee1246e" containerID="84103f43681cfbf203ab1f3727d8ad666d0206fa059eaf0ea7ac00753e92fb17" exitCode=0 Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.950940 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" event={"ID":"a9087aa4-944f-458e-9dfc-0d2e1ee1246e","Type":"ContainerDied","Data":"84103f43681cfbf203ab1f3727d8ad666d0206fa059eaf0ea7ac00753e92fb17"} Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.961769 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:51 crc kubenswrapper[4783]: E0131 09:06:51.962911 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.462901279 +0000 UTC m=+123.131584748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:51 crc kubenswrapper[4783]: I0131 09:06:51.971449 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" podStartSLOduration=99.971439463 podStartE2EDuration="1m39.971439463s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:51.968438923 +0000 UTC m=+122.637122391" watchObservedRunningTime="2026-01-31 09:06:51.971439463 +0000 UTC m=+122.640122932" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.002468 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.005931 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wxdg7" podStartSLOduration=100.005917214 podStartE2EDuration="1m40.005917214s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.005133042 +0000 UTC m=+122.673816511" watchObservedRunningTime="2026-01-31 09:06:52.005917214 +0000 UTC m=+122.674600682" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.014980 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" event={"ID":"40a6c5ba-0bc2-4e23-b92b-8486e77001ae","Type":"ContainerStarted","Data":"5a26e596f5208034fac0e712d3606c123f7e19796adf6f7086311c4eb94ddabb"} Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.034854 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" podStartSLOduration=99.034837994 podStartE2EDuration="1m39.034837994s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.03335987 +0000 UTC m=+122.702043338" watchObservedRunningTime="2026-01-31 09:06:52.034837994 +0000 UTC m=+122.703521461" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.045313 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" event={"ID":"0e337b91-adb8-4cb7-8e5e-be2b80e78f56","Type":"ContainerStarted","Data":"ad6a12d00be0b39e8e42b40d23109d3ff479e257e7f85d8f383bdafc4bedb060"} Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.066131 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-774bh" podStartSLOduration=99.066115437 podStartE2EDuration="1m39.066115437s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.065911531 +0000 UTC m=+122.734595000" watchObservedRunningTime="2026-01-31 09:06:52.066115437 +0000 UTC m=+122.734798905" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.066438 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.066573 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.566546431 +0000 UTC m=+123.235229899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.066830 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.067960 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.567949924 +0000 UTC m=+123.236633391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.076000 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" event={"ID":"4b11032c-d124-4cd4-a810-90b94f3755cc","Type":"ContainerStarted","Data":"eabae18eed1086ad931f7e875e20215ad1744673d221a55a9907348d2b4fcf62"} Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.092389 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" event={"ID":"92e6ae0d-9d2b-4fca-b7ba-16d7ffecdee1","Type":"ContainerStarted","Data":"c8032ea23c598b70b973ea40708428a41fab2160ebc69cdf4129ede47540b85c"} Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.108473 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" podStartSLOduration=99.10845856 podStartE2EDuration="1m39.10845856s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.108051291 +0000 UTC m=+122.776734759" watchObservedRunningTime="2026-01-31 09:06:52.10845856 +0000 UTC m=+122.777142017" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.167981 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.168794 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.668780426 +0000 UTC m=+123.337463895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.194916 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7kn7h" podStartSLOduration=100.194900434 podStartE2EDuration="1m40.194900434s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.140691359 +0000 UTC m=+122.809374827" watchObservedRunningTime="2026-01-31 09:06:52.194900434 +0000 UTC m=+122.863583902" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.219974 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-m54sp" podStartSLOduration=99.219957824 podStartE2EDuration="1m39.219957824s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.196829499 +0000 UTC m=+122.865512967" watchObservedRunningTime="2026-01-31 09:06:52.219957824 +0000 UTC m=+122.888641292" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.260495 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" podStartSLOduration=99.260478442 podStartE2EDuration="1m39.260478442s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:52.259473624 +0000 UTC m=+122.928157092" watchObservedRunningTime="2026-01-31 09:06:52.260478442 +0000 UTC m=+122.929161910" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.269411 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.271127 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.771115954 +0000 UTC m=+123.439799422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.372380 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.372821 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.872806181 +0000 UTC m=+123.541489649 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.474011 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.474357 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:52.974343261 +0000 UTC m=+123.643026728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.574974 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.575116 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.075095085 +0000 UTC m=+123.743778553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.575225 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.575503 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.075495943 +0000 UTC m=+123.744179411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.676840 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.677030 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.177004025 +0000 UTC m=+123.845687494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.677221 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.677468 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.177456391 +0000 UTC m=+123.846139859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.778038 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.778219 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.278194359 +0000 UTC m=+123.946877828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.778277 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.778568 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.278559348 +0000 UTC m=+123.947242817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.835577 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:52 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:52 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:52 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.835629 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.879070 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.879214 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:06:53.379196868 +0000 UTC m=+124.047880326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.941642 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-31 09:01:51 +0000 UTC, rotation deadline is 2026-11-12 22:25:20.306181623 +0000 UTC Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.941904 4783 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6853h18m27.364280548s for next certificate rotation Jan 31 09:06:52 crc kubenswrapper[4783]: I0131 09:06:52.980659 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:52 crc kubenswrapper[4783]: E0131 09:06:52.980988 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.480973809 +0000 UTC m=+124.149657276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.055101 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.055136 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.063390 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.081208 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.081365 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.581335797 +0000 UTC m=+124.250019265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.081446 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.081719 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.581711777 +0000 UTC m=+124.250395245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.113585 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ljfs4" event={"ID":"4b11032c-d124-4cd4-a810-90b94f3755cc","Type":"ContainerStarted","Data":"486cb0d019199cd4c9f8d2b9f68120cd44b64605eae309c3247408bd13861da2"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.118506 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" event={"ID":"a9087aa4-944f-458e-9dfc-0d2e1ee1246e","Type":"ContainerStarted","Data":"d4171094746fe03366f9d1a6861e51366e4835425c5cbd032e05ca8416fa2e02"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.118577 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.124636 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" event={"ID":"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7","Type":"ContainerStarted","Data":"335393f9a202676ba7353256b01eee2614d5fe532323fda243f7b7a96abf1066"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.124677 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" event={"ID":"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7","Type":"ContainerStarted","Data":"bce5454b1bc713161c1cee16731ce2d3b2bd6ab6d127e688c5744a1956caf6d5"} Jan 31 09:06:53 crc 
kubenswrapper[4783]: I0131 09:06:53.125907 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" event={"ID":"4c28e17f-e541-4f08-88cf-0bd130b756cc","Type":"ContainerStarted","Data":"bde88d11dfd95c5f6b16fa3660362c2bf7c0ca9fbb6b9f72b42eb6ae7405f98d"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.125929 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" event={"ID":"4c28e17f-e541-4f08-88cf-0bd130b756cc","Type":"ContainerStarted","Data":"ecfebb2cb335a55a01a656b083a79a6e298fa70c3425dd81efe749dd82ff4a4b"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.135370 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" event={"ID":"06a9b2ff-51a8-4f3c-8424-9e2ce7df0e7f","Type":"ContainerStarted","Data":"0758a4a6611b39c9f2ed070796b2c3328254f106eb678aa8fb34d5f3ed9ed36f"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.137619 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" podStartSLOduration=101.137610656 podStartE2EDuration="1m41.137610656s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.137524653 +0000 UTC m=+123.806208112" watchObservedRunningTime="2026-01-31 09:06:53.137610656 +0000 UTC m=+123.806294125" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.148447 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" event={"ID":"579c1ccd-dc05-4543-9f3e-9e82915896dc","Type":"ContainerStarted","Data":"4e1ab693abf2a78ac0124f382de0c049b691d4084e46741c27c4b1862114c606"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 
09:06:53.150925 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6pvst" event={"ID":"9343e0fc-ae90-490e-9f5a-eb1668c75226","Type":"ContainerStarted","Data":"acea3d49ac1d5bbd6da835abfafe9318a7dda871c4cecb95a3459837f4ee5fcc"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.157648 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf2tf" podStartSLOduration=100.157641437 podStartE2EDuration="1m40.157641437s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.155798935 +0000 UTC m=+123.824482404" watchObservedRunningTime="2026-01-31 09:06:53.157641437 +0000 UTC m=+123.826324905" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.158724 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" event={"ID":"593183b0-1bb3-42d1-8949-d1b56f0ac114","Type":"ContainerStarted","Data":"e1965ea11957c03f51f7e7bc761d981decf5cd4a77d9b47916cca5e75a30967b"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.162374 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-55h6m" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.164403 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" event={"ID":"40a6c5ba-0bc2-4e23-b92b-8486e77001ae","Type":"ContainerStarted","Data":"47d917cf10d3503d67bf649bbca22bafc781192269614238b2aa3cb6eeaf5fc9"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.164432 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" 
event={"ID":"40a6c5ba-0bc2-4e23-b92b-8486e77001ae","Type":"ContainerStarted","Data":"b2d4fde7cf7e2f382785ca02b123078521b029c8e974d4f1a9a8aaf1b8400f92"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.171417 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" event={"ID":"cbbd545d-5fde-4209-84e8-9252737745c4","Type":"ContainerStarted","Data":"10e5ec7f34c9b66e24151f3d67bc06c44ae3440014ef98d017cae0b9d1e116dc"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.171909 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.179485 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" event={"ID":"c7522dc2-2021-4ca3-8ece-f051f1149e61","Type":"ContainerStarted","Data":"6f90df32532e624b78866ace02dfe8710caad29d1f96af12f9fb2212d7eccc67"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.181511 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lljb6" podStartSLOduration=100.181482849 podStartE2EDuration="1m40.181482849s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.180626391 +0000 UTC m=+123.849309859" watchObservedRunningTime="2026-01-31 09:06:53.181482849 +0000 UTC m=+123.850166317" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.182636 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.182946 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.682930564 +0000 UTC m=+124.351614033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.199592 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-88hqp" event={"ID":"ee769697-49da-45f8-824a-583871b70123","Type":"ContainerStarted","Data":"9a4827a87b2a88e31c8ad76b46cb31958d21f4a31b10489f8b6c63b2f8d760e3"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.217284 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" event={"ID":"7cf20dc6-2184-41b1-a943-f917dafb36b4","Type":"ContainerStarted","Data":"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.219494 4783 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lkj9z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.219551 4783 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.230426 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnzfn" event={"ID":"6e2e303d-86eb-4b59-bfde-b8bfccb3ae65","Type":"ContainerStarted","Data":"edeeb03e42dba863eab11249cd839ab1d0b04e39fb7050517fecdbe6549fb6b1"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.237879 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" podStartSLOduration=101.237862807 podStartE2EDuration="1m41.237862807s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.236621781 +0000 UTC m=+123.905305250" watchObservedRunningTime="2026-01-31 09:06:53.237862807 +0000 UTC m=+123.906546275" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.239807 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" event={"ID":"44c20261-8b8d-4fe3-9ae7-a07a46eafac8","Type":"ContainerStarted","Data":"5a82e0d1a0002773b3d0a3da7962cd5fa6a18b34619b2c03830fcfe060c20488"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.239883 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" event={"ID":"44c20261-8b8d-4fe3-9ae7-a07a46eafac8","Type":"ContainerStarted","Data":"a9e6ce65d12601e844ffddee5b998b443c74fe8e824f25186f54b18c03cf46f6"} Jan 31 09:06:53 crc 
kubenswrapper[4783]: I0131 09:06:53.239974 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.246356 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" event={"ID":"94e21399-452d-4820-8683-6536189c56c0","Type":"ContainerStarted","Data":"fa605da08d929691be164d191744a3cdc58b16b9adb891cd851a378ce79e99c5"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.246402 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" event={"ID":"94e21399-452d-4820-8683-6536189c56c0","Type":"ContainerStarted","Data":"e5a4e1e558f95d0dbeff75b6fb10374ea02d9d6272dc86da07f925b858ec39cc"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.249443 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mv84q" podStartSLOduration=101.249421027 podStartE2EDuration="1m41.249421027s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.249149253 +0000 UTC m=+123.917832722" watchObservedRunningTime="2026-01-31 09:06:53.249421027 +0000 UTC m=+123.918104495" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.255380 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sm9jn" event={"ID":"a57e8066-1fe2-4664-9415-fa8a8b6621c5","Type":"ContainerStarted","Data":"ffd6a9ec27c5484679d581b36542e08deb6b96c13603dfbfdbb4587b98b97f57"} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.276389 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jtdqd" Jan 31 09:06:53 crc 
kubenswrapper[4783]: I0131 09:06:53.278567 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" podStartSLOduration=100.278549009 podStartE2EDuration="1m40.278549009s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.276267848 +0000 UTC m=+123.944951326" watchObservedRunningTime="2026-01-31 09:06:53.278549009 +0000 UTC m=+123.947232477" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.284336 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.286145 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.78613308 +0000 UTC m=+124.454816548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.304963 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-n9kcz" podStartSLOduration=100.304944266 podStartE2EDuration="1m40.304944266s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.303421589 +0000 UTC m=+123.972105057" watchObservedRunningTime="2026-01-31 09:06:53.304944266 +0000 UTC m=+123.973627734" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.338871 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.339782 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.345410 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.350959 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" podStartSLOduration=100.350947056 podStartE2EDuration="1m40.350947056s" podCreationTimestamp="2026-01-31 09:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.349728903 +0000 UTC m=+124.018412371" watchObservedRunningTime="2026-01-31 09:06:53.350947056 +0000 UTC m=+124.019630524" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.363780 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.385325 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.386981 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.88696206 +0000 UTC m=+124.555645528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.413034 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-88hqp" podStartSLOduration=8.413016223 podStartE2EDuration="8.413016223s" podCreationTimestamp="2026-01-31 09:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.37430322 +0000 UTC m=+124.042986687" watchObservedRunningTime="2026-01-31 09:06:53.413016223 +0000 UTC m=+124.081699692" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.446536 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4n7fz" podStartSLOduration=101.446517198 podStartE2EDuration="1m41.446517198s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:53.415566422 +0000 UTC m=+124.084249890" watchObservedRunningTime="2026-01-31 09:06:53.446517198 +0000 UTC m=+124.115200665" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.453229 4783 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.486886 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.486948 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t6g\" (UniqueName: \"kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.486986 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.487014 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.487296 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:53.987285104 +0000 UTC m=+124.655968572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.543032 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.543865 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.546421 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.550743 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.551028 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.572530 4783 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7h6ff container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]log ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]etcd ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:06:53 crc 
kubenswrapper[4783]: [+]poststarthook/max-in-flight-filter ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 31 09:06:53 crc kubenswrapper[4783]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 31 09:06:53 crc kubenswrapper[4783]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/project.openshift.io-projectcache ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-startinformers ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 31 09:06:53 crc kubenswrapper[4783]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:06:53 crc kubenswrapper[4783]: livez check failed Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.572576 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" podUID="c7522dc2-2021-4ca3-8ece-f051f1149e61" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.588850 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.589150 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.589206 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t6g\" (UniqueName: \"kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.589256 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.589702 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.589800 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:06:54.089786214 +0000 UTC m=+124.758469683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.589908 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g68tl" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.590201 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.590663 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.618358 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t6g\" (UniqueName: \"kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g\") pod \"community-operators-xxtnp\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.659194 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.690633 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.690687 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.690720 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgjm\" (UniqueName: \"kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.690747 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: E0131 09:06:53.691033 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:06:54.19102008 +0000 UTC m=+124.859703548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mx2nq" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.744913 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.745861 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.757599 4783 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-31T09:06:53.453248959Z","Handler":null,"Name":""} Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.767343 4783 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.767373 4783 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.769142 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.792823 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.792996 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.793071 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.793117 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgjm\" (UniqueName: \"kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.793932 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.794009 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.797566 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.810771 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgjm\" (UniqueName: \"kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm\") pod \"certified-operators-2whxk\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.834951 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:53 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:53 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:53 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.835017 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.867000 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.893962 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.894064 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4hz\" (UniqueName: \"kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.894091 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.894138 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.901321 4783 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.901353 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.938276 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.939030 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.953720 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.968615 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.969053 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mx2nq\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.994718 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4hz\" (UniqueName: \"kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.994771 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.994820 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content\") pod \"community-operators-tzbx8\" (UID: 
\"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.995443 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:53 crc kubenswrapper[4783]: I0131 09:06:53.995667 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.029939 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4hz\" (UniqueName: \"kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz\") pod \"community-operators-tzbx8\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.070767 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.086035 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.093323 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.096838 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttnl\" (UniqueName: \"kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.096928 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.096972 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.152883 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.198825 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.199176 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bttnl\" (UniqueName: \"kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.199202 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.199547 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.199828 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: W0131 09:06:54.199832 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f8d4b0_8393_4fbc_bb6d_f7d321645e9e.slice/crio-446112daa90169d02c98ba6acb57fb896c72c238703654bf17158e7a68a395b6 WatchSource:0}: Error finding container 446112daa90169d02c98ba6acb57fb896c72c238703654bf17158e7a68a395b6: Status 404 returned error can't find the container with id 
446112daa90169d02c98ba6acb57fb896c72c238703654bf17158e7a68a395b6 Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.213936 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttnl\" (UniqueName: \"kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl\") pod \"certified-operators-pd4w8\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.262737 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.264416 4783 generic.go:334] "Generic (PLEG): container finished" podID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerID="913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5" exitCode=0 Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.264519 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerDied","Data":"913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5"} Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.264573 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerStarted","Data":"b07faea632ef4e32fa16a49de06c4dfb6c79aa876b198b86ad9b2275ec286fbc"} Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.266656 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.268193 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" 
event={"ID":"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7","Type":"ContainerStarted","Data":"2c02303ad3639b354278d15c3969587c0a34ed96a17622bbbb20c8134f868b9a"} Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.268388 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" event={"ID":"c2ff120c-221e-4dcb-a51f-9c53c3fe25c7","Type":"ContainerStarted","Data":"b3a10ab9e8c2115d05871f245f56486eadf234b4c9c23c8322c76ae2a71b272a"} Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.268572 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.271605 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerStarted","Data":"446112daa90169d02c98ba6acb57fb896c72c238703654bf17158e7a68a395b6"} Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.276932 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.479905 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-brrd5" podStartSLOduration=9.479873969 podStartE2EDuration="9.479873969s" podCreationTimestamp="2026-01-31 09:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:54.322319939 +0000 UTC m=+124.991003407" watchObservedRunningTime="2026-01-31 09:06:54.479873969 +0000 UTC m=+125.148557438" Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.481202 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 
09:06:54.490744 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.844240 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:54 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:54 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:54 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:54 crc kubenswrapper[4783]: I0131 09:06:54.844357 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.277853 4783 generic.go:334] "Generic (PLEG): container finished" podID="f01828fd-fba3-487c-a2c6-f5599e1c379d" containerID="4ade95b367c1495ae5ec997fab2d073aca17ec4cedaa325800e7e230d740ec37" exitCode=0 Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.277945 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" event={"ID":"f01828fd-fba3-487c-a2c6-f5599e1c379d","Type":"ContainerDied","Data":"4ade95b367c1495ae5ec997fab2d073aca17ec4cedaa325800e7e230d740ec37"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.280714 4783 generic.go:334] "Generic (PLEG): container finished" podID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerID="50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95" exitCode=0 Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.280905 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" 
event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerDied","Data":"50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.282262 4783 generic.go:334] "Generic (PLEG): container finished" podID="e052fd63-7a83-423f-84f9-6591c58046ce" containerID="f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247" exitCode=0 Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.282308 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerDied","Data":"f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.282322 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerStarted","Data":"8b096c0a81bb9c716a87edb3d254995627dd3a1814cd01ecbfee923183d20af7"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.284356 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" event={"ID":"27f9649b-142f-44ae-9e1b-8a6d8026ddcc","Type":"ContainerStarted","Data":"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.284374 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" event={"ID":"27f9649b-142f-44ae-9e1b-8a6d8026ddcc","Type":"ContainerStarted","Data":"8ba864220b34e169768e77b918d44399857128b829714cd4b5e929c16d186eaf"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.284759 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.289977 4783 generic.go:334] 
"Generic (PLEG): container finished" podID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerID="300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee" exitCode=0 Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.290256 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerDied","Data":"300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.290276 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerStarted","Data":"17667b8f68e533857087dd61ea7492ec9ef2a768ce24fb1ca3ac1c6872dbc4f5"} Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.326994 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" podStartSLOduration=103.326972319 podStartE2EDuration="1m43.326972319s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:55.326471634 +0000 UTC m=+125.995155101" watchObservedRunningTime="2026-01-31 09:06:55.326972319 +0000 UTC m=+125.995655788" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.531311 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.532884 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.533282 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.536315 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.618307 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.618547 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.618672 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4cf4\" (UniqueName: \"kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.654015 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.720684 
4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.720867 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4cf4\" (UniqueName: \"kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.721630 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.721377 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.722427 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.739848 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l4cf4\" (UniqueName: \"kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4\") pod \"redhat-marketplace-s4h5q\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.836468 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:55 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:55 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:55 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.836557 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.854374 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.926419 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.927843 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:55 crc kubenswrapper[4783]: I0131 09:06:55.937760 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.026985 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.027302 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.027339 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ct72\" (UniqueName: \"kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.128941 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.128988 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4ct72\" (UniqueName: \"kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.129022 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.130093 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.130120 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.162891 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ct72\" (UniqueName: \"kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72\") pod \"redhat-marketplace-wlxlx\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.243359 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:06:56 crc kubenswrapper[4783]: W0131 09:06:56.253952 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fca12d_3abf_4543_b5f7_205f4bd75149.slice/crio-d86ce69f65e312dcc15bef8fcda51e894c5d368c5e1e65986a4fa51d4e51b238 WatchSource:0}: Error finding container d86ce69f65e312dcc15bef8fcda51e894c5d368c5e1e65986a4fa51d4e51b238: Status 404 returned error can't find the container with id d86ce69f65e312dcc15bef8fcda51e894c5d368c5e1e65986a4fa51d4e51b238 Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.261126 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.310398 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerStarted","Data":"d86ce69f65e312dcc15bef8fcda51e894c5d368c5e1e65986a4fa51d4e51b238"} Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.531737 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.534673 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.537045 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.538550 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.542857 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.641976 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume\") pod \"f01828fd-fba3-487c-a2c6-f5599e1c379d\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642147 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume\") pod \"f01828fd-fba3-487c-a2c6-f5599e1c379d\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642264 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdnr4\" (UniqueName: \"kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4\") pod \"f01828fd-fba3-487c-a2c6-f5599e1c379d\" (UID: \"f01828fd-fba3-487c-a2c6-f5599e1c379d\") " Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642678 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9fk\" (UniqueName: \"kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642678 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume" (OuterVolumeSpecName: "config-volume") pod "f01828fd-fba3-487c-a2c6-f5599e1c379d" (UID: "f01828fd-fba3-487c-a2c6-f5599e1c379d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642902 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.642964 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.643051 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f01828fd-fba3-487c-a2c6-f5599e1c379d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.646781 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f01828fd-fba3-487c-a2c6-f5599e1c379d" (UID: "f01828fd-fba3-487c-a2c6-f5599e1c379d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.647238 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4" (OuterVolumeSpecName: "kube-api-access-tdnr4") pod "f01828fd-fba3-487c-a2c6-f5599e1c379d" (UID: "f01828fd-fba3-487c-a2c6-f5599e1c379d"). InnerVolumeSpecName "kube-api-access-tdnr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.726267 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744135 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9fk\" (UniqueName: \"kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744257 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744307 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744463 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdnr4\" (UniqueName: \"kubernetes.io/projected/f01828fd-fba3-487c-a2c6-f5599e1c379d-kube-api-access-tdnr4\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744482 4783 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f01828fd-fba3-487c-a2c6-f5599e1c379d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 
09:06:56.744900 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.744904 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.759357 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9fk\" (UniqueName: \"kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk\") pod \"redhat-operators-zzqtt\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.835743 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:56 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:56 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:56 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.835800 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.848791 4783 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.935759 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:06:56 crc kubenswrapper[4783]: E0131 09:06:56.936312 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01828fd-fba3-487c-a2c6-f5599e1c379d" containerName="collect-profiles" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.936325 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01828fd-fba3-487c-a2c6-f5599e1c379d" containerName="collect-profiles" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.936437 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01828fd-fba3-487c-a2c6-f5599e1c379d" containerName="collect-profiles" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.937145 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:56 crc kubenswrapper[4783]: I0131 09:06:56.938182 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.053210 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.053257 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n777v\" (UniqueName: \"kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.054107 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.117566 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.119589 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.122096 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.122639 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.127784 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.155845 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.155884 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n777v\" (UniqueName: \"kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.155996 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.157134 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.157426 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.178902 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n777v\" (UniqueName: \"kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v\") pod \"redhat-operators-xwzlk\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.248956 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.257142 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.257202 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.327862 4783 generic.go:334] "Generic (PLEG): container finished" podID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerID="f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec" exitCode=0 Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.327947 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerDied","Data":"f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec"} Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.331292 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" event={"ID":"f01828fd-fba3-487c-a2c6-f5599e1c379d","Type":"ContainerDied","Data":"d4a398d6bb0e2b270902fb8b1cce54ee0f559638412338b6be0ad72c54209476"} Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.331332 4783 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d4a398d6bb0e2b270902fb8b1cce54ee0f559638412338b6be0ad72c54209476" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.331375 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.358699 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.358960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.358730 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.371630 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.444721 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.676010 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wqsnf" Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.835702 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:57 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:57 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:57 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:57 crc kubenswrapper[4783]: I0131 09:06:57.835750 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.034088 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.034139 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.035873 4783 patch_prober.go:28] interesting pod/console-f9d7485db-xjwbp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.035946 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xjwbp" 
podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.047420 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9chh8" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.554439 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.558528 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7h6ff" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.833695 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.835828 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:58 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:58 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:58 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.835904 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.876916 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 
09:06:58.877852 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.880845 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.881800 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.886083 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.985240 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:58 crc kubenswrapper[4783]: I0131 09:06:58.985320 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.086563 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.086628 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.086859 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.105807 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.203149 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.836265 4783 patch_prober.go:28] interesting pod/router-default-5444994796-wb9bj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:06:59 crc kubenswrapper[4783]: [-]has-synced failed: reason withheld Jan 31 09:06:59 crc kubenswrapper[4783]: [+]process-running ok Jan 31 09:06:59 crc kubenswrapper[4783]: healthz check failed Jan 31 09:06:59 crc kubenswrapper[4783]: I0131 09:06:59.836345 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wb9bj" podUID="9c24edcb-aeef-44a1-99b6-9e7904c41253" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:07:00 crc kubenswrapper[4783]: I0131 09:07:00.797142 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sm9jn" Jan 31 09:07:00 crc kubenswrapper[4783]: I0131 09:07:00.836275 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:07:00 crc kubenswrapper[4783]: I0131 09:07:00.840175 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wb9bj" Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.368251 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.368699 4783 generic.go:334] "Generic (PLEG): container finished" podID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerID="66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f" exitCode=0 Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.369048 4783 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerDied","Data":"66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f"} Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.369099 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerStarted","Data":"b92a8b7b48871bcdfc42734260a094972d8234591ba5cb2c2d6fe18d7e131058"} Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.450926 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.531067 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:07:01 crc kubenswrapper[4783]: I0131 09:07:01.532282 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:07:01 crc kubenswrapper[4783]: W0131 09:07:01.560830 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6d2558ed_aac8_452b_a38f_948fbac1e1dd.slice/crio-7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8 WatchSource:0}: Error finding container 7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8: Status 404 returned error can't find the container with id 7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8 Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.376473 4783 generic.go:334] "Generic (PLEG): container finished" podID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerID="d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f" exitCode=0 Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.376541 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" 
event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerDied","Data":"d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.376879 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerStarted","Data":"1e0c861067154e0d5e2d04c3ccc3970c55807992ae7cec4e8d5c08217b4bf84b"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.379443 4783 generic.go:334] "Generic (PLEG): container finished" podID="5b452361-1b41-4ca1-9ce5-352dd7390d36" containerID="386b2ea7f2ca8000100f96e373fa34466178e3b96c0f3620f7275dcc05cc60aa" exitCode=0 Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.379507 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b452361-1b41-4ca1-9ce5-352dd7390d36","Type":"ContainerDied","Data":"386b2ea7f2ca8000100f96e373fa34466178e3b96c0f3620f7275dcc05cc60aa"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.379532 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b452361-1b41-4ca1-9ce5-352dd7390d36","Type":"ContainerStarted","Data":"d19fed8d98521ed17ebf5e7df52581cba5286fa26a8b527c0b9f968b19d3b7cf"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.382219 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6d2558ed-aac8-452b-a38f-948fbac1e1dd","Type":"ContainerStarted","Data":"ee3cf12b5d5802dff49c76583a8aa68eb0807f316e80d75e48674efb354e4142"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.382257 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"6d2558ed-aac8-452b-a38f-948fbac1e1dd","Type":"ContainerStarted","Data":"7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.384678 4783 generic.go:334] "Generic (PLEG): container finished" podID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerID="b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1" exitCode=0 Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.384720 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerDied","Data":"b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.384736 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerStarted","Data":"2cf9600cb99cf0eefe910bb01f3838906d4691d607db4c48af3f08987a939019"} Jan 31 09:07:02 crc kubenswrapper[4783]: I0131 09:07:02.427826 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=5.42779377 podStartE2EDuration="5.42779377s" podCreationTimestamp="2026-01-31 09:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:07:02.425223363 +0000 UTC m=+133.093906851" watchObservedRunningTime="2026-01-31 09:07:02.42779377 +0000 UTC m=+133.096477237" Jan 31 09:07:03 crc kubenswrapper[4783]: I0131 09:07:03.402406 4783 generic.go:334] "Generic (PLEG): container finished" podID="6d2558ed-aac8-452b-a38f-948fbac1e1dd" containerID="ee3cf12b5d5802dff49c76583a8aa68eb0807f316e80d75e48674efb354e4142" exitCode=0 Jan 31 09:07:03 crc kubenswrapper[4783]: I0131 09:07:03.402898 4783 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6d2558ed-aac8-452b-a38f-948fbac1e1dd","Type":"ContainerDied","Data":"ee3cf12b5d5802dff49c76583a8aa68eb0807f316e80d75e48674efb354e4142"} Jan 31 09:07:05 crc kubenswrapper[4783]: I0131 09:07:05.079897 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.380451 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.421707 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b452361-1b41-4ca1-9ce5-352dd7390d36","Type":"ContainerDied","Data":"d19fed8d98521ed17ebf5e7df52581cba5286fa26a8b527c0b9f968b19d3b7cf"} Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.421747 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19fed8d98521ed17ebf5e7df52581cba5286fa26a8b527c0b9f968b19d3b7cf" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.421806 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.487097 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir\") pod \"5b452361-1b41-4ca1-9ce5-352dd7390d36\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.487390 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access\") pod \"5b452361-1b41-4ca1-9ce5-352dd7390d36\" (UID: \"5b452361-1b41-4ca1-9ce5-352dd7390d36\") " Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.487998 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b452361-1b41-4ca1-9ce5-352dd7390d36" (UID: "5b452361-1b41-4ca1-9ce5-352dd7390d36"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.492982 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b452361-1b41-4ca1-9ce5-352dd7390d36" (UID: "5b452361-1b41-4ca1-9ce5-352dd7390d36"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.588399 4783 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b452361-1b41-4ca1-9ce5-352dd7390d36-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:06 crc kubenswrapper[4783]: I0131 09:07:06.588432 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b452361-1b41-4ca1-9ce5-352dd7390d36-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:08 crc kubenswrapper[4783]: I0131 09:07:08.037526 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:07:08 crc kubenswrapper[4783]: I0131 09:07:08.040552 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.045453 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.135296 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir\") pod \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.135355 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d2558ed-aac8-452b-a38f-948fbac1e1dd" (UID: "6d2558ed-aac8-452b-a38f-948fbac1e1dd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.135561 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access\") pod \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\" (UID: \"6d2558ed-aac8-452b-a38f-948fbac1e1dd\") " Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.135961 4783 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.146821 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d2558ed-aac8-452b-a38f-948fbac1e1dd" (UID: "6d2558ed-aac8-452b-a38f-948fbac1e1dd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.237696 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d2558ed-aac8-452b-a38f-948fbac1e1dd-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.439265 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6d2558ed-aac8-452b-a38f-948fbac1e1dd","Type":"ContainerDied","Data":"7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8"} Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.439297 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:07:09 crc kubenswrapper[4783]: I0131 09:07:09.439305 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f031ec2364a64d0efa4e6d64a31afa10ecae22deddcc1caddea03af325c1cf8" Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.098998 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.477005 4783 generic.go:334] "Generic (PLEG): container finished" podID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerID="67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.477060 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerDied","Data":"67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.481645 4783 generic.go:334] "Generic (PLEG): container finished" podID="e052fd63-7a83-423f-84f9-6591c58046ce" containerID="825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.481792 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerDied","Data":"825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.484856 4783 generic.go:334] "Generic (PLEG): container finished" podID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerID="ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.484939 4783 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerDied","Data":"ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.486812 4783 generic.go:334] "Generic (PLEG): container finished" podID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerID="e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.486940 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerDied","Data":"e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.489856 4783 generic.go:334] "Generic (PLEG): container finished" podID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerID="bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.489921 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerDied","Data":"bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.492679 4783 generic.go:334] "Generic (PLEG): container finished" podID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerID="a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.492890 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerDied","Data":"a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253"} Jan 31 09:07:14 crc 
kubenswrapper[4783]: I0131 09:07:14.498133 4783 generic.go:334] "Generic (PLEG): container finished" podID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerID="3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.498227 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerDied","Data":"3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11"} Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.503744 4783 generic.go:334] "Generic (PLEG): container finished" podID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerID="042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb" exitCode=0 Jan 31 09:07:14 crc kubenswrapper[4783]: I0131 09:07:14.503789 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerDied","Data":"042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.518593 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerStarted","Data":"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.521713 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerStarted","Data":"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.524566 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" 
event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerStarted","Data":"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.527034 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerStarted","Data":"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.529691 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerStarted","Data":"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.531939 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerStarted","Data":"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.534598 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerStarted","Data":"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.537470 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerStarted","Data":"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67"} Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.547567 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pd4w8" podStartSLOduration=2.775103535 
podStartE2EDuration="22.547551147s" podCreationTimestamp="2026-01-31 09:06:53 +0000 UTC" firstStartedPulling="2026-01-31 09:06:55.29199762 +0000 UTC m=+125.960681088" lastFinishedPulling="2026-01-31 09:07:15.064445232 +0000 UTC m=+145.733128700" observedRunningTime="2026-01-31 09:07:15.545417005 +0000 UTC m=+146.214100463" watchObservedRunningTime="2026-01-31 09:07:15.547551147 +0000 UTC m=+146.216234616" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.567838 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zzqtt" podStartSLOduration=6.787535082 podStartE2EDuration="19.567811753s" podCreationTimestamp="2026-01-31 09:06:56 +0000 UTC" firstStartedPulling="2026-01-31 09:07:02.378038871 +0000 UTC m=+133.046722339" lastFinishedPulling="2026-01-31 09:07:15.158315543 +0000 UTC m=+145.826999010" observedRunningTime="2026-01-31 09:07:15.565376581 +0000 UTC m=+146.234060048" watchObservedRunningTime="2026-01-31 09:07:15.567811753 +0000 UTC m=+146.236495211" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.589811 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tzbx8" podStartSLOduration=2.908837596 podStartE2EDuration="22.589796526s" podCreationTimestamp="2026-01-31 09:06:53 +0000 UTC" firstStartedPulling="2026-01-31 09:06:55.283707956 +0000 UTC m=+125.952391424" lastFinishedPulling="2026-01-31 09:07:14.964666886 +0000 UTC m=+145.633350354" observedRunningTime="2026-01-31 09:07:15.586193798 +0000 UTC m=+146.254877266" watchObservedRunningTime="2026-01-31 09:07:15.589796526 +0000 UTC m=+146.258479994" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.604889 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwzlk" podStartSLOduration=7.019689189 podStartE2EDuration="19.604877494s" podCreationTimestamp="2026-01-31 09:06:56 +0000 UTC" 
firstStartedPulling="2026-01-31 09:07:02.386105544 +0000 UTC m=+133.054789012" lastFinishedPulling="2026-01-31 09:07:14.971293849 +0000 UTC m=+145.639977317" observedRunningTime="2026-01-31 09:07:15.600543033 +0000 UTC m=+146.269226501" watchObservedRunningTime="2026-01-31 09:07:15.604877494 +0000 UTC m=+146.273560962" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.613136 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2whxk" podStartSLOduration=2.8559847229999997 podStartE2EDuration="22.613122965s" podCreationTimestamp="2026-01-31 09:06:53 +0000 UTC" firstStartedPulling="2026-01-31 09:06:55.282933002 +0000 UTC m=+125.951616469" lastFinishedPulling="2026-01-31 09:07:15.040071243 +0000 UTC m=+145.708754711" observedRunningTime="2026-01-31 09:07:15.612893341 +0000 UTC m=+146.281576838" watchObservedRunningTime="2026-01-31 09:07:15.613122965 +0000 UTC m=+146.281806422" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.638079 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s4h5q" podStartSLOduration=5.994210324 podStartE2EDuration="20.638060738s" podCreationTimestamp="2026-01-31 09:06:55 +0000 UTC" firstStartedPulling="2026-01-31 09:07:00.381201419 +0000 UTC m=+131.049884877" lastFinishedPulling="2026-01-31 09:07:15.025051822 +0000 UTC m=+145.693735291" observedRunningTime="2026-01-31 09:07:15.636437671 +0000 UTC m=+146.305121139" watchObservedRunningTime="2026-01-31 09:07:15.638060738 +0000 UTC m=+146.306744206" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.665932 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xxtnp" podStartSLOduration=1.821318918 podStartE2EDuration="22.665914532s" podCreationTimestamp="2026-01-31 09:06:53 +0000 UTC" firstStartedPulling="2026-01-31 09:06:54.266427864 +0000 UTC m=+124.935111332" 
lastFinishedPulling="2026-01-31 09:07:15.111023478 +0000 UTC m=+145.779706946" observedRunningTime="2026-01-31 09:07:15.664710898 +0000 UTC m=+146.333394366" watchObservedRunningTime="2026-01-31 09:07:15.665914532 +0000 UTC m=+146.334598000" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.666073 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wlxlx" podStartSLOduration=7.04876855 podStartE2EDuration="20.666067952s" podCreationTimestamp="2026-01-31 09:06:55 +0000 UTC" firstStartedPulling="2026-01-31 09:07:01.375504293 +0000 UTC m=+132.044187761" lastFinishedPulling="2026-01-31 09:07:14.992803694 +0000 UTC m=+145.661487163" observedRunningTime="2026-01-31 09:07:15.651951907 +0000 UTC m=+146.320635376" watchObservedRunningTime="2026-01-31 09:07:15.666067952 +0000 UTC m=+146.334751410" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.731037 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.731077 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.731129 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.731146 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.735244 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.735419 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.735553 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.742869 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.743853 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.748145 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.763971 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.764084 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.855398 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.855592 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.855715 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.866729 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:07:15 crc kubenswrapper[4783]: I0131 09:07:15.871773 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.262175 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.262520 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:16 crc kubenswrapper[4783]: W0131 09:07:16.301909 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-cd5fb27e33b73c7e7db761ca5d1ea5279e07e8592c9f6b3d5c32999b4b94d233 WatchSource:0}: Error finding container cd5fb27e33b73c7e7db761ca5d1ea5279e07e8592c9f6b3d5c32999b4b94d233: Status 404 returned error can't find the container with id cd5fb27e33b73c7e7db761ca5d1ea5279e07e8592c9f6b3d5c32999b4b94d233 Jan 31 09:07:16 crc kubenswrapper[4783]: W0131 09:07:16.431465 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-d595b44aadb7935fb52d005400af45ea88996bcdd39ca997d1d6b51862467272 WatchSource:0}: Error finding container d595b44aadb7935fb52d005400af45ea88996bcdd39ca997d1d6b51862467272: Status 404 returned error can't find the container with id d595b44aadb7935fb52d005400af45ea88996bcdd39ca997d1d6b51862467272 Jan 31 09:07:16 crc kubenswrapper[4783]: W0131 09:07:16.503594 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-ebc53aaf797ed86fc923f72ace1b986422beb0577d5dc0e5128658c98e74229b WatchSource:0}: Error finding container ebc53aaf797ed86fc923f72ace1b986422beb0577d5dc0e5128658c98e74229b: Status 404 returned error can't find the container with id ebc53aaf797ed86fc923f72ace1b986422beb0577d5dc0e5128658c98e74229b Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.564358 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d595b44aadb7935fb52d005400af45ea88996bcdd39ca997d1d6b51862467272"} Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.574941 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ebc53aaf797ed86fc923f72ace1b986422beb0577d5dc0e5128658c98e74229b"} Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.596147 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6064deb6c201e5dad8140d471694a7281c87ea5dea0a1b80e8d8cbd95c1f89b5"} Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.596455 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cd5fb27e33b73c7e7db761ca5d1ea5279e07e8592c9f6b3d5c32999b4b94d233"} Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.849370 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:07:16 crc 
kubenswrapper[4783]: I0131 09:07:16.849626 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:07:16 crc kubenswrapper[4783]: I0131 09:07:16.953549 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-s4h5q" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="registry-server" probeResult="failure" output=< Jan 31 09:07:16 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:07:16 crc kubenswrapper[4783]: > Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.249628 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.250021 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.308266 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-wlxlx" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="registry-server" probeResult="failure" output=< Jan 31 09:07:17 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:07:17 crc kubenswrapper[4783]: > Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.603068 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6cd8f3ab6e2e02808e0c6b7238d0a49614f2528ce51c1b701cbd47b6de349bb1"} Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.605526 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2e27940a36273cf1cf112924769917b98e74940f5f031a68ca8ad39ae0a7ff8f"} Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.757051 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.757389 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:07:17 crc kubenswrapper[4783]: I0131 09:07:17.893877 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zzqtt" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="registry-server" probeResult="failure" output=< Jan 31 09:07:17 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:07:17 crc kubenswrapper[4783]: > Jan 31 09:07:18 crc kubenswrapper[4783]: I0131 09:07:18.285962 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwzlk" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="registry-server" probeResult="failure" output=< Jan 31 09:07:18 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:07:18 crc kubenswrapper[4783]: > Jan 31 09:07:18 crc kubenswrapper[4783]: I0131 09:07:18.615754 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.660289 4783 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.660922 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.693352 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.843291 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.868170 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.868221 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:07:23 crc kubenswrapper[4783]: I0131 09:07:23.902297 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.072390 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.072840 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.099505 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.269181 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.269230 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.298847 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.695797 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.697671 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.698791 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:24 crc kubenswrapper[4783]: I0131 09:07:24.701001 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:25 crc kubenswrapper[4783]: I0131 09:07:25.722433 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:07:25 crc kubenswrapper[4783]: I0131 09:07:25.895141 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:07:25 crc kubenswrapper[4783]: I0131 09:07:25.922400 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 09:07:26.291388 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 
09:07:26.320904 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 09:07:26.323201 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 09:07:26.668318 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pd4w8" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="registry-server" containerID="cri-o://f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824" gracePeriod=2 Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 09:07:26.885442 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:07:26 crc kubenswrapper[4783]: I0131 09:07:26.923585 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.129905 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.198947 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content\") pod \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.199049 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bttnl\" (UniqueName: \"kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl\") pod \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.199082 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities\") pod \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\" (UID: \"cd47c3c5-ff77-4c21-b855-820a1aa46d05\") " Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.200083 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities" (OuterVolumeSpecName: "utilities") pod "cd47c3c5-ff77-4c21-b855-820a1aa46d05" (UID: "cd47c3c5-ff77-4c21-b855-820a1aa46d05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.204178 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl" (OuterVolumeSpecName: "kube-api-access-bttnl") pod "cd47c3c5-ff77-4c21-b855-820a1aa46d05" (UID: "cd47c3c5-ff77-4c21-b855-820a1aa46d05"). InnerVolumeSpecName "kube-api-access-bttnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.231319 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd47c3c5-ff77-4c21-b855-820a1aa46d05" (UID: "cd47c3c5-ff77-4c21-b855-820a1aa46d05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.283379 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.300774 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.300883 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bttnl\" (UniqueName: \"kubernetes.io/projected/cd47c3c5-ff77-4c21-b855-820a1aa46d05-kube-api-access-bttnl\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.300963 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47c3c5-ff77-4c21-b855-820a1aa46d05-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.313201 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.676595 4783 generic.go:334] "Generic (PLEG): container finished" podID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerID="f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824" exitCode=0 Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 
09:07:27.677357 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pd4w8" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.677730 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerDied","Data":"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824"} Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.677768 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pd4w8" event={"ID":"cd47c3c5-ff77-4c21-b855-820a1aa46d05","Type":"ContainerDied","Data":"17667b8f68e533857087dd61ea7492ec9ef2a768ce24fb1ca3ac1c6872dbc4f5"} Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.677792 4783 scope.go:117] "RemoveContainer" containerID="f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.678391 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tzbx8" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="registry-server" containerID="cri-o://c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201" gracePeriod=2 Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.700021 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.705539 4783 scope.go:117] "RemoveContainer" containerID="bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.705730 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pd4w8"] Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.728853 4783 scope.go:117] "RemoveContainer" 
containerID="300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.811524 4783 scope.go:117] "RemoveContainer" containerID="f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824" Jan 31 09:07:27 crc kubenswrapper[4783]: E0131 09:07:27.818711 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824\": container with ID starting with f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824 not found: ID does not exist" containerID="f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.819950 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824"} err="failed to get container status \"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824\": rpc error: code = NotFound desc = could not find container \"f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824\": container with ID starting with f9ef7109f5d097bbd6477fe02f454f4582df1b4d04a13868fac6f3ecb295e824 not found: ID does not exist" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.820038 4783 scope.go:117] "RemoveContainer" containerID="bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5" Jan 31 09:07:27 crc kubenswrapper[4783]: E0131 09:07:27.821740 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5\": container with ID starting with bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5 not found: ID does not exist" containerID="bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5" Jan 31 09:07:27 crc 
kubenswrapper[4783]: I0131 09:07:27.821862 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5"} err="failed to get container status \"bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5\": rpc error: code = NotFound desc = could not find container \"bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5\": container with ID starting with bd705856fd45dd28df90249ed440f181a500f77b3af6f6ece8dddaf2124f6af5 not found: ID does not exist" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.821935 4783 scope.go:117] "RemoveContainer" containerID="300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee" Jan 31 09:07:27 crc kubenswrapper[4783]: E0131 09:07:27.825524 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee\": container with ID starting with 300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee not found: ID does not exist" containerID="300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee" Jan 31 09:07:27 crc kubenswrapper[4783]: I0131 09:07:27.825559 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee"} err="failed to get container status \"300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee\": rpc error: code = NotFound desc = could not find container \"300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee\": container with ID starting with 300432cd73bcf07b59f726e1af2efa4512141633333326b469d9bda18eaf8aee not found: ID does not exist" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.083280 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.214125 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content\") pod \"e052fd63-7a83-423f-84f9-6591c58046ce\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.214509 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk4hz\" (UniqueName: \"kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz\") pod \"e052fd63-7a83-423f-84f9-6591c58046ce\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.214627 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities\") pod \"e052fd63-7a83-423f-84f9-6591c58046ce\" (UID: \"e052fd63-7a83-423f-84f9-6591c58046ce\") " Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.215199 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities" (OuterVolumeSpecName: "utilities") pod "e052fd63-7a83-423f-84f9-6591c58046ce" (UID: "e052fd63-7a83-423f-84f9-6591c58046ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.217388 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz" (OuterVolumeSpecName: "kube-api-access-zk4hz") pod "e052fd63-7a83-423f-84f9-6591c58046ce" (UID: "e052fd63-7a83-423f-84f9-6591c58046ce"). InnerVolumeSpecName "kube-api-access-zk4hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.255355 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e052fd63-7a83-423f-84f9-6591c58046ce" (UID: "e052fd63-7a83-423f-84f9-6591c58046ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.316383 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk4hz\" (UniqueName: \"kubernetes.io/projected/e052fd63-7a83-423f-84f9-6591c58046ce-kube-api-access-zk4hz\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.316414 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.316425 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e052fd63-7a83-423f-84f9-6591c58046ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.685922 4783 generic.go:334] "Generic (PLEG): container finished" podID="e052fd63-7a83-423f-84f9-6591c58046ce" containerID="c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201" exitCode=0 Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.685963 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerDied","Data":"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201"} Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.685987 4783 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tzbx8" event={"ID":"e052fd63-7a83-423f-84f9-6591c58046ce","Type":"ContainerDied","Data":"8b096c0a81bb9c716a87edb3d254995627dd3a1814cd01ecbfee923183d20af7"} Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.686000 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzbx8" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.686006 4783 scope.go:117] "RemoveContainer" containerID="c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.701932 4783 scope.go:117] "RemoveContainer" containerID="825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.708756 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.711239 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tzbx8"] Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.721134 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.721626 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wlxlx" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="registry-server" containerID="cri-o://233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69" gracePeriod=2 Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.729660 4783 scope.go:117] "RemoveContainer" containerID="f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.743157 4783 scope.go:117] "RemoveContainer" 
containerID="c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201" Jan 31 09:07:28 crc kubenswrapper[4783]: E0131 09:07:28.743500 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201\": container with ID starting with c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201 not found: ID does not exist" containerID="c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.743530 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201"} err="failed to get container status \"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201\": rpc error: code = NotFound desc = could not find container \"c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201\": container with ID starting with c908e7c7f624ff3a868aee6a94240ddaf5375248e1e94d272adf04a16cddb201 not found: ID does not exist" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.743553 4783 scope.go:117] "RemoveContainer" containerID="825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b" Jan 31 09:07:28 crc kubenswrapper[4783]: E0131 09:07:28.743850 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b\": container with ID starting with 825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b not found: ID does not exist" containerID="825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.743870 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b"} err="failed to get container status \"825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b\": rpc error: code = NotFound desc = could not find container \"825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b\": container with ID starting with 825cbb8ac020d86174918ab47e323c0825528db7f035c39ba6c201db9d65bd4b not found: ID does not exist" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.743890 4783 scope.go:117] "RemoveContainer" containerID="f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247" Jan 31 09:07:28 crc kubenswrapper[4783]: E0131 09:07:28.744537 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247\": container with ID starting with f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247 not found: ID does not exist" containerID="f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247" Jan 31 09:07:28 crc kubenswrapper[4783]: I0131 09:07:28.744587 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247"} err="failed to get container status \"f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247\": rpc error: code = NotFound desc = could not find container \"f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247\": container with ID starting with f51f263d104fcdf96f9282abb5f262b97f8ac13bf2bfa50d60cbbfa437072247 not found: ID does not exist" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.073594 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hcbdl" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.136194 4783 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.226226 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities\") pod \"76827fef-ee7e-476c-82d3-8c43754e04d9\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.226404 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content\") pod \"76827fef-ee7e-476c-82d3-8c43754e04d9\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.226437 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ct72\" (UniqueName: \"kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72\") pod \"76827fef-ee7e-476c-82d3-8c43754e04d9\" (UID: \"76827fef-ee7e-476c-82d3-8c43754e04d9\") " Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.227010 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities" (OuterVolumeSpecName: "utilities") pod "76827fef-ee7e-476c-82d3-8c43754e04d9" (UID: "76827fef-ee7e-476c-82d3-8c43754e04d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.230410 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72" (OuterVolumeSpecName: "kube-api-access-4ct72") pod "76827fef-ee7e-476c-82d3-8c43754e04d9" (UID: "76827fef-ee7e-476c-82d3-8c43754e04d9"). 
InnerVolumeSpecName "kube-api-access-4ct72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.244057 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76827fef-ee7e-476c-82d3-8c43754e04d9" (UID: "76827fef-ee7e-476c-82d3-8c43754e04d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.328135 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.328190 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ct72\" (UniqueName: \"kubernetes.io/projected/76827fef-ee7e-476c-82d3-8c43754e04d9-kube-api-access-4ct72\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.328211 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76827fef-ee7e-476c-82d3-8c43754e04d9-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.656929 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" path="/var/lib/kubelet/pods/cd47c3c5-ff77-4c21-b855-820a1aa46d05/volumes" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.657795 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" path="/var/lib/kubelet/pods/e052fd63-7a83-423f-84f9-6591c58046ce/volumes" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.697572 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerID="233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69" exitCode=0 Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.697673 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerDied","Data":"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69"} Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.697736 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlxlx" event={"ID":"76827fef-ee7e-476c-82d3-8c43754e04d9","Type":"ContainerDied","Data":"b92a8b7b48871bcdfc42734260a094972d8234591ba5cb2c2d6fe18d7e131058"} Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.697758 4783 scope.go:117] "RemoveContainer" containerID="233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.698359 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlxlx" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.719734 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.722258 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlxlx"] Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.722307 4783 scope.go:117] "RemoveContainer" containerID="042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.742631 4783 scope.go:117] "RemoveContainer" containerID="66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.755086 4783 scope.go:117] "RemoveContainer" containerID="233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69" Jan 31 09:07:29 crc kubenswrapper[4783]: E0131 09:07:29.755422 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69\": container with ID starting with 233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69 not found: ID does not exist" containerID="233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.755448 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69"} err="failed to get container status \"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69\": rpc error: code = NotFound desc = could not find container \"233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69\": container with ID starting with 233723abf43d96177454fb092ec804b320e0e54cee8ca11e3f5cdbe9a4205a69 not found: 
ID does not exist" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.755471 4783 scope.go:117] "RemoveContainer" containerID="042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb" Jan 31 09:07:29 crc kubenswrapper[4783]: E0131 09:07:29.755772 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb\": container with ID starting with 042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb not found: ID does not exist" containerID="042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.755835 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb"} err="failed to get container status \"042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb\": rpc error: code = NotFound desc = could not find container \"042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb\": container with ID starting with 042e680f7759c696e6d5d39f7adf6b3f85206632fa094f57503c0e1fac746acb not found: ID does not exist" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.755883 4783 scope.go:117] "RemoveContainer" containerID="66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f" Jan 31 09:07:29 crc kubenswrapper[4783]: E0131 09:07:29.756395 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f\": container with ID starting with 66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f not found: ID does not exist" containerID="66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f" Jan 31 09:07:29 crc kubenswrapper[4783]: I0131 09:07:29.756496 4783 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f"} err="failed to get container status \"66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f\": rpc error: code = NotFound desc = could not find container \"66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f\": container with ID starting with 66ba566561ed25035903173aa9f4d3e89a950675e699c7769b00a3c7e8ae258f not found: ID does not exist" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.123197 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.123483 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwzlk" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="registry-server" containerID="cri-o://b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839" gracePeriod=2 Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.548967 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.653448 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" path="/var/lib/kubelet/pods/76827fef-ee7e-476c-82d3-8c43754e04d9/volumes" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.657849 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content\") pod \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.657918 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n777v\" (UniqueName: \"kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v\") pod \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.657982 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities\") pod \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\" (UID: \"c3584910-fcf1-4d1d-a15c-8f4bc1a77809\") " Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.658767 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities" (OuterVolumeSpecName: "utilities") pod "c3584910-fcf1-4d1d-a15c-8f4bc1a77809" (UID: "c3584910-fcf1-4d1d-a15c-8f4bc1a77809"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.663003 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v" (OuterVolumeSpecName: "kube-api-access-n777v") pod "c3584910-fcf1-4d1d-a15c-8f4bc1a77809" (UID: "c3584910-fcf1-4d1d-a15c-8f4bc1a77809"). InnerVolumeSpecName "kube-api-access-n777v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.724504 4783 generic.go:334] "Generic (PLEG): container finished" podID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerID="b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839" exitCode=0 Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.724548 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerDied","Data":"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839"} Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.724597 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwzlk" event={"ID":"c3584910-fcf1-4d1d-a15c-8f4bc1a77809","Type":"ContainerDied","Data":"2cf9600cb99cf0eefe910bb01f3838906d4691d607db4c48af3f08987a939019"} Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.724606 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwzlk" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.724620 4783 scope.go:117] "RemoveContainer" containerID="b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.746044 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3584910-fcf1-4d1d-a15c-8f4bc1a77809" (UID: "c3584910-fcf1-4d1d-a15c-8f4bc1a77809"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.750683 4783 scope.go:117] "RemoveContainer" containerID="a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.760043 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.760070 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n777v\" (UniqueName: \"kubernetes.io/projected/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-kube-api-access-n777v\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.760086 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3584910-fcf1-4d1d-a15c-8f4bc1a77809-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.776501 4783 scope.go:117] "RemoveContainer" containerID="b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.787735 4783 scope.go:117] "RemoveContainer" 
containerID="b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839" Jan 31 09:07:31 crc kubenswrapper[4783]: E0131 09:07:31.788269 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839\": container with ID starting with b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839 not found: ID does not exist" containerID="b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.788323 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839"} err="failed to get container status \"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839\": rpc error: code = NotFound desc = could not find container \"b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839\": container with ID starting with b66b38aff5a362b498c81015d729f5a915bddae3b6b9782393c45255a0eed839 not found: ID does not exist" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.788355 4783 scope.go:117] "RemoveContainer" containerID="a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253" Jan 31 09:07:31 crc kubenswrapper[4783]: E0131 09:07:31.788687 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253\": container with ID starting with a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253 not found: ID does not exist" containerID="a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.788714 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253"} err="failed to get container status \"a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253\": rpc error: code = NotFound desc = could not find container \"a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253\": container with ID starting with a7f3953ee649ffd12709f8350b2881d0ef649d8b61d4f19faf0c94ca3cd01253 not found: ID does not exist" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.788729 4783 scope.go:117] "RemoveContainer" containerID="b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1" Jan 31 09:07:31 crc kubenswrapper[4783]: E0131 09:07:31.788960 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1\": container with ID starting with b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1 not found: ID does not exist" containerID="b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1" Jan 31 09:07:31 crc kubenswrapper[4783]: I0131 09:07:31.788986 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1"} err="failed to get container status \"b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1\": rpc error: code = NotFound desc = could not find container \"b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1\": container with ID starting with b746c3dfb12f2cf101fcbd435da413b115aa09af2b710a9c70e623e9e537efa1 not found: ID does not exist" Jan 31 09:07:32 crc kubenswrapper[4783]: I0131 09:07:32.052087 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:07:32 crc kubenswrapper[4783]: I0131 09:07:32.057527 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-xwzlk"] Jan 31 09:07:33 crc kubenswrapper[4783]: I0131 09:07:33.653067 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" path="/var/lib/kubelet/pods/c3584910-fcf1-4d1d-a15c-8f4bc1a77809/volumes" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.191516 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.193311 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.207372 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84961ed7-35f8-4e6a-987c-cabb84cf7268-metrics-certs\") pod \"network-metrics-daemon-xg6x2\" (UID: \"84961ed7-35f8-4e6a-987c-cabb84cf7268\") " pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.462314 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.471191 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xg6x2" Jan 31 09:07:34 crc kubenswrapper[4783]: I0131 09:07:34.842969 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xg6x2"] Jan 31 09:07:34 crc kubenswrapper[4783]: W0131 09:07:34.849385 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84961ed7_35f8_4e6a_987c_cabb84cf7268.slice/crio-0869621a0f9f823fd6beda82cf13599a2f302e99daa9ec2b82f6fdd97da1650c WatchSource:0}: Error finding container 0869621a0f9f823fd6beda82cf13599a2f302e99daa9ec2b82f6fdd97da1650c: Status 404 returned error can't find the container with id 0869621a0f9f823fd6beda82cf13599a2f302e99daa9ec2b82f6fdd97da1650c Jan 31 09:07:35 crc kubenswrapper[4783]: I0131 09:07:35.753735 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" event={"ID":"84961ed7-35f8-4e6a-987c-cabb84cf7268","Type":"ContainerStarted","Data":"3d08b89c2c98e61f8b291195604a2cb544f09ff0bbd406b904e87586519e3404"} Jan 31 09:07:35 crc kubenswrapper[4783]: I0131 09:07:35.753777 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" event={"ID":"84961ed7-35f8-4e6a-987c-cabb84cf7268","Type":"ContainerStarted","Data":"18648c5cb41b33dab1b07bd5a4cdb5b25fb813454eebd88947f214a9ff867cc5"} Jan 31 09:07:35 crc kubenswrapper[4783]: I0131 09:07:35.753804 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xg6x2" event={"ID":"84961ed7-35f8-4e6a-987c-cabb84cf7268","Type":"ContainerStarted","Data":"0869621a0f9f823fd6beda82cf13599a2f302e99daa9ec2b82f6fdd97da1650c"} Jan 31 09:07:35 crc kubenswrapper[4783]: I0131 09:07:35.768756 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xg6x2" podStartSLOduration=143.768739476 
podStartE2EDuration="2m23.768739476s" podCreationTimestamp="2026-01-31 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:07:35.765254141 +0000 UTC m=+166.433937609" watchObservedRunningTime="2026-01-31 09:07:35.768739476 +0000 UTC m=+166.437422944" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.071881 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072119 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072134 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072155 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072186 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072197 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072203 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072214 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072220 4783 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072231 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072236 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072243 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2558ed-aac8-452b-a38f-948fbac1e1dd" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072249 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2558ed-aac8-452b-a38f-948fbac1e1dd" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072258 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b452361-1b41-4ca1-9ce5-352dd7390d36" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072263 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b452361-1b41-4ca1-9ce5-352dd7390d36" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072269 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072276 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072283 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072290 4783 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072299 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072305 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072313 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072318 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072325 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072330 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="extract-utilities" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072337 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072343 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: E0131 09:07:36.072349 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072355 4783 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="extract-content" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072439 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e052fd63-7a83-423f-84f9-6591c58046ce" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072447 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd47c3c5-ff77-4c21-b855-820a1aa46d05" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072456 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b452361-1b41-4ca1-9ce5-352dd7390d36" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072464 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2558ed-aac8-452b-a38f-948fbac1e1dd" containerName="pruner" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072472 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3584910-fcf1-4d1d-a15c-8f4bc1a77809" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072480 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="76827fef-ee7e-476c-82d3-8c43754e04d9" containerName="registry-server" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.072891 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.075561 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.075761 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.082193 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.212763 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.212836 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.316179 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.316384 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.316537 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.338945 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.386004 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:36 crc kubenswrapper[4783]: I0131 09:07:36.774086 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:07:37 crc kubenswrapper[4783]: I0131 09:07:37.764045 4783 generic.go:334] "Generic (PLEG): container finished" podID="3129c6e8-8652-489c-a932-a188abb19046" containerID="1c22a1a987533851d9b34b2e2df3c22768f363fbf602fa059935c907a94d2b31" exitCode=0 Jan 31 09:07:37 crc kubenswrapper[4783]: I0131 09:07:37.764137 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3129c6e8-8652-489c-a932-a188abb19046","Type":"ContainerDied","Data":"1c22a1a987533851d9b34b2e2df3c22768f363fbf602fa059935c907a94d2b31"} Jan 31 09:07:37 crc kubenswrapper[4783]: I0131 09:07:37.764337 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3129c6e8-8652-489c-a932-a188abb19046","Type":"ContainerStarted","Data":"b1a26657b10af8facf90b8d571e1d16a15b234617c2f6e0e3aaee47761e79d1b"} Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.014213 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.146301 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir\") pod \"3129c6e8-8652-489c-a932-a188abb19046\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.146417 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access\") pod \"3129c6e8-8652-489c-a932-a188abb19046\" (UID: \"3129c6e8-8652-489c-a932-a188abb19046\") " Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.146405 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3129c6e8-8652-489c-a932-a188abb19046" (UID: "3129c6e8-8652-489c-a932-a188abb19046"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.146797 4783 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3129c6e8-8652-489c-a932-a188abb19046-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.161747 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3129c6e8-8652-489c-a932-a188abb19046" (UID: "3129c6e8-8652-489c-a932-a188abb19046"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.247886 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3129c6e8-8652-489c-a932-a188abb19046-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.774795 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3129c6e8-8652-489c-a932-a188abb19046","Type":"ContainerDied","Data":"b1a26657b10af8facf90b8d571e1d16a15b234617c2f6e0e3aaee47761e79d1b"} Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.774866 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1a26657b10af8facf90b8d571e1d16a15b234617c2f6e0e3aaee47761e79d1b" Jan 31 09:07:39 crc kubenswrapper[4783]: I0131 09:07:39.774834 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.870043 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:07:41 crc kubenswrapper[4783]: E0131 09:07:41.870499 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3129c6e8-8652-489c-a932-a188abb19046" containerName="pruner" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.870512 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="3129c6e8-8652-489c-a932-a188abb19046" containerName="pruner" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.870607 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="3129c6e8-8652-489c-a932-a188abb19046" containerName="pruner" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.870945 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.872882 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.875450 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.887731 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.975378 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.975448 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:41 crc kubenswrapper[4783]: I0131 09:07:41.975497 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.076483 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.076568 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.076610 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.076629 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.076732 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.092468 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.184014 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.548301 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.792474 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ecd2b3c-2ff6-4a90-b525-e262382bd09f","Type":"ContainerStarted","Data":"0b4a55938dbc01bc8c33bc7b0ab07099028fd4199c59f764d0763efec34e7a17"} Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.792849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ecd2b3c-2ff6-4a90-b525-e262382bd09f","Type":"ContainerStarted","Data":"56f865689e232d3f110b0bb7a2b0ae8f888f8e476c1b5c1557c5531e81ec11d3"} Jan 31 09:07:42 crc kubenswrapper[4783]: I0131 09:07:42.818382 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.8183547039999999 podStartE2EDuration="1.818354704s" podCreationTimestamp="2026-01-31 09:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:07:42.814774159 +0000 UTC m=+173.483457627" watchObservedRunningTime="2026-01-31 09:07:42.818354704 +0000 UTC m=+173.487038172" Jan 31 09:07:47 crc kubenswrapper[4783]: I0131 09:07:47.757159 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:07:47 crc 
kubenswrapper[4783]: I0131 09:07:47.757461 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:07:48 crc kubenswrapper[4783]: I0131 09:07:48.860696 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" podUID="f100d6ab-c3b2-4712-b2d3-370287baadb4" containerName="oauth-openshift" containerID="cri-o://9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6" gracePeriod=15 Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.171863 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.254587 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.254642 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.254707 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255308 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255307 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sdf2\" (UniqueName: \"kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255402 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255427 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255456 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255477 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255503 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255533 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255545 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255581 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255691 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255741 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255797 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies\") pod \"f100d6ab-c3b2-4712-b2d3-370287baadb4\" (UID: \"f100d6ab-c3b2-4712-b2d3-370287baadb4\") " Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.255976 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.256337 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.256358 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.256370 4783 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.256543 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.256728 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.259862 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.260395 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.260483 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.260612 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2" (OuterVolumeSpecName: "kube-api-access-4sdf2") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "kube-api-access-4sdf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.260884 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.261197 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.261349 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.261563 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.262020 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f100d6ab-c3b2-4712-b2d3-370287baadb4" (UID: "f100d6ab-c3b2-4712-b2d3-370287baadb4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357077 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357133 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357147 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sdf2\" (UniqueName: \"kubernetes.io/projected/f100d6ab-c3b2-4712-b2d3-370287baadb4-kube-api-access-4sdf2\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357178 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357191 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-cliconfig\") 
on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357202 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357213 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357238 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357249 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357259 4783 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f100d6ab-c3b2-4712-b2d3-370287baadb4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.357271 4783 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f100d6ab-c3b2-4712-b2d3-370287baadb4-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.844367 4783 generic.go:334] "Generic (PLEG): container finished" podID="f100d6ab-c3b2-4712-b2d3-370287baadb4" 
containerID="9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6" exitCode=0 Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.844463 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.844461 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" event={"ID":"f100d6ab-c3b2-4712-b2d3-370287baadb4","Type":"ContainerDied","Data":"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6"} Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.844840 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7csp2" event={"ID":"f100d6ab-c3b2-4712-b2d3-370287baadb4","Type":"ContainerDied","Data":"611c376cdf6854afd499618d547c89b888bf4c401bd54406ea49d33cfaf2857b"} Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.844885 4783 scope.go:117] "RemoveContainer" containerID="9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.860997 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.862768 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7csp2"] Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.866071 4783 scope.go:117] "RemoveContainer" containerID="9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6" Jan 31 09:07:49 crc kubenswrapper[4783]: E0131 09:07:49.866582 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6\": container with ID starting with 
9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6 not found: ID does not exist" containerID="9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6" Jan 31 09:07:49 crc kubenswrapper[4783]: I0131 09:07:49.866637 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6"} err="failed to get container status \"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6\": rpc error: code = NotFound desc = could not find container \"9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6\": container with ID starting with 9ddcdd900cc8dca3f8caa79f7b2985059e791b694da6ae4879f8e12188acefc6 not found: ID does not exist" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.653353 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f100d6ab-c3b2-4712-b2d3-370287baadb4" path="/var/lib/kubelet/pods/f100d6ab-c3b2-4712-b2d3-370287baadb4/volumes" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.932147 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79656f7ff7-qlhbq"] Jan 31 09:07:51 crc kubenswrapper[4783]: E0131 09:07:51.932384 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f100d6ab-c3b2-4712-b2d3-370287baadb4" containerName="oauth-openshift" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.932397 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f100d6ab-c3b2-4712-b2d3-370287baadb4" containerName="oauth-openshift" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.932505 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f100d6ab-c3b2-4712-b2d3-370287baadb4" containerName="oauth-openshift" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.933074 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.934723 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936412 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936429 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936453 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936455 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936664 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936675 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936779 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936949 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.936976 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:07:51 
crc kubenswrapper[4783]: I0131 09:07:51.937024 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.937537 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.942918 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.943756 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79656f7ff7-qlhbq"] Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.946192 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.947624 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.986976 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-service-ca\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987029 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-dir\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " 
pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987060 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-policies\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987081 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987098 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wdm\" (UniqueName: \"kubernetes.io/projected/48daab9e-f491-45d1-abf4-c82d736fce0a-kube-api-access-26wdm\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987118 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-router-certs\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987133 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987153 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-login\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987194 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987216 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987254 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-session\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987273 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987290 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-error\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:51 crc kubenswrapper[4783]: I0131 09:07:51.987314 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.088340 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: 
\"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089082 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-session\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089214 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089314 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-error\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089417 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089513 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-service-ca\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089599 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-dir\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089683 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-policies\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089762 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wdm\" (UniqueName: \"kubernetes.io/projected/48daab9e-f491-45d1-abf4-c82d736fce0a-kube-api-access-26wdm\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089839 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " 
pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089926 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-router-certs\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.089996 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.090069 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-login\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.090153 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.092732 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093085 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093264 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-service-ca\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093427 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-dir\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093765 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-error\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " 
pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093732 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.093882 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48daab9e-f491-45d1-abf4-c82d736fce0a-audit-policies\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.096027 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-router-certs\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.096402 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.099637 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-login\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.102485 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-session\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.105901 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.107125 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/48daab9e-f491-45d1-abf4-c82d736fce0a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.108443 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wdm\" (UniqueName: \"kubernetes.io/projected/48daab9e-f491-45d1-abf4-c82d736fce0a-kube-api-access-26wdm\") pod \"oauth-openshift-79656f7ff7-qlhbq\" (UID: \"48daab9e-f491-45d1-abf4-c82d736fce0a\") " 
pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.248416 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.589036 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79656f7ff7-qlhbq"] Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.862849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" event={"ID":"48daab9e-f491-45d1-abf4-c82d736fce0a","Type":"ContainerStarted","Data":"bbd722d187654d0b05e7e6e79d65fc121da2998e297e435864c303b8f9c46b5d"} Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.864138 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.864226 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" event={"ID":"48daab9e-f491-45d1-abf4-c82d736fce0a","Type":"ContainerStarted","Data":"f24c46765b32f8325a8309f07c0f215c4daaf0cca7cde897682645ff9e6bb8ee"} Jan 31 09:07:52 crc kubenswrapper[4783]: I0131 09:07:52.883426 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" podStartSLOduration=29.883410897 podStartE2EDuration="29.883410897s" podCreationTimestamp="2026-01-31 09:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:07:52.879069382 +0000 UTC m=+183.547752851" watchObservedRunningTime="2026-01-31 09:07:52.883410897 +0000 UTC m=+183.552094365" Jan 31 09:07:53 crc kubenswrapper[4783]: I0131 09:07:53.135726 4783 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79656f7ff7-qlhbq" Jan 31 09:07:55 crc kubenswrapper[4783]: I0131 09:07:55.877875 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.382973 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.384093 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2whxk" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="registry-server" containerID="cri-o://772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631" gracePeriod=30 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.390394 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.392340 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xxtnp" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="registry-server" containerID="cri-o://92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67" gracePeriod=30 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.402787 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.403001 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" containerID="cri-o://7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543" gracePeriod=30 Jan 31 09:08:04 crc 
kubenswrapper[4783]: I0131 09:08:04.415824 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.417098 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s4h5q" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="registry-server" containerID="cri-o://0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9" gracePeriod=30 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.419733 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-729tq"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.420665 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.421405 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.421661 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zzqtt" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="registry-server" containerID="cri-o://2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c" gracePeriod=30 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.429615 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-729tq"] Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.539084 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-729tq\" (UID: 
\"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.539512 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.539607 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5tv\" (UniqueName: \"kubernetes.io/projected/6e7fa19e-aa64-4479-805e-62625ccc19b8-kube-api-access-ll5tv\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.640637 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5tv\" (UniqueName: \"kubernetes.io/projected/6e7fa19e-aa64-4479-805e-62625ccc19b8-kube-api-access-ll5tv\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.640699 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.640758 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.642245 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.647282 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e7fa19e-aa64-4479-805e-62625ccc19b8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.654277 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5tv\" (UniqueName: \"kubernetes.io/projected/6e7fa19e-aa64-4479-805e-62625ccc19b8-kube-api-access-ll5tv\") pod \"marketplace-operator-79b997595-729tq\" (UID: \"6e7fa19e-aa64-4479-805e-62625ccc19b8\") " pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.737435 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.766730 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.828299 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.836126 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.837638 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.842090 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.943135 4783 generic.go:334] "Generic (PLEG): container finished" podID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerID="92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67" exitCode=0 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.943263 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerDied","Data":"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.943308 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxtnp" event={"ID":"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e","Type":"ContainerDied","Data":"b07faea632ef4e32fa16a49de06c4dfb6c79aa876b198b86ad9b2275ec286fbc"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.943365 4783 scope.go:117] "RemoveContainer" containerID="92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67" Jan 31 09:08:04 crc kubenswrapper[4783]: 
I0131 09:08:04.943538 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxtnp" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944291 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9t6g\" (UniqueName: \"kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g\") pod \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944452 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content\") pod \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944519 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlgjm\" (UniqueName: \"kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm\") pod \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944555 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca\") pod \"7cf20dc6-2184-41b1-a943-f917dafb36b4\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944599 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content\") pod \"a0fca12d-3abf-4543-b5f7-205f4bd75149\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " Jan 31 09:08:04 crc 
kubenswrapper[4783]: I0131 09:08:04.944627 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities\") pod \"a0fca12d-3abf-4543-b5f7-205f4bd75149\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944679 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities\") pod \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944775 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc9fk\" (UniqueName: \"kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk\") pod \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\" (UID: \"108dfc0b-86ff-45c1-8d9f-a879d585ddff\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944811 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content\") pod \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944869 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4std2\" (UniqueName: \"kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2\") pod \"7cf20dc6-2184-41b1-a943-f917dafb36b4\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944910 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4cf4\" (UniqueName: 
\"kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4\") pod \"a0fca12d-3abf-4543-b5f7-205f4bd75149\" (UID: \"a0fca12d-3abf-4543-b5f7-205f4bd75149\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944938 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content\") pod \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.944987 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities\") pod \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\" (UID: \"58cd7170-ffdf-4cf8-9d2c-4e0251ada36e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.945020 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities\") pod \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\" (UID: \"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.945075 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics\") pod \"7cf20dc6-2184-41b1-a943-f917dafb36b4\" (UID: \"7cf20dc6-2184-41b1-a943-f917dafb36b4\") " Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.946108 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities" (OuterVolumeSpecName: "utilities") pod "a0fca12d-3abf-4543-b5f7-205f4bd75149" (UID: "a0fca12d-3abf-4543-b5f7-205f4bd75149"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.946793 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7cf20dc6-2184-41b1-a943-f917dafb36b4" (UID: "7cf20dc6-2184-41b1-a943-f917dafb36b4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.949275 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities" (OuterVolumeSpecName: "utilities") pod "108dfc0b-86ff-45c1-8d9f-a879d585ddff" (UID: "108dfc0b-86ff-45c1-8d9f-a879d585ddff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.949745 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities" (OuterVolumeSpecName: "utilities") pod "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" (UID: "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.949977 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7cf20dc6-2184-41b1-a943-f917dafb36b4" (UID: "7cf20dc6-2184-41b1-a943-f917dafb36b4"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.950265 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities" (OuterVolumeSpecName: "utilities") pod "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" (UID: "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.950497 4783 generic.go:334] "Generic (PLEG): container finished" podID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerID="7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543" exitCode=0 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.950570 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.950660 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" event={"ID":"7cf20dc6-2184-41b1-a943-f917dafb36b4","Type":"ContainerDied","Data":"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.950867 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lkj9z" event={"ID":"7cf20dc6-2184-41b1-a943-f917dafb36b4","Type":"ContainerDied","Data":"f82906df5be8f70c9f0319456b0a6ada2d3989c8fad437f7e9159f4ecedda1be"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.951725 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g" (OuterVolumeSpecName: "kube-api-access-p9t6g") pod "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" (UID: "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e"). 
InnerVolumeSpecName "kube-api-access-p9t6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.953498 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2" (OuterVolumeSpecName: "kube-api-access-4std2") pod "7cf20dc6-2184-41b1-a943-f917dafb36b4" (UID: "7cf20dc6-2184-41b1-a943-f917dafb36b4"). InnerVolumeSpecName "kube-api-access-4std2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.953921 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4" (OuterVolumeSpecName: "kube-api-access-l4cf4") pod "a0fca12d-3abf-4543-b5f7-205f4bd75149" (UID: "a0fca12d-3abf-4543-b5f7-205f4bd75149"). InnerVolumeSpecName "kube-api-access-l4cf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.954290 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk" (OuterVolumeSpecName: "kube-api-access-cc9fk") pod "108dfc0b-86ff-45c1-8d9f-a879d585ddff" (UID: "108dfc0b-86ff-45c1-8d9f-a879d585ddff"). InnerVolumeSpecName "kube-api-access-cc9fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.954466 4783 generic.go:334] "Generic (PLEG): container finished" podID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerID="772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631" exitCode=0 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.954578 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerDied","Data":"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.954602 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2whxk" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.954642 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2whxk" event={"ID":"67f8d4b0-8393-4fbc-bb6d-f7d321645e9e","Type":"ContainerDied","Data":"446112daa90169d02c98ba6acb57fb896c72c238703654bf17158e7a68a395b6"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.955389 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm" (OuterVolumeSpecName: "kube-api-access-rlgjm") pod "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" (UID: "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e"). InnerVolumeSpecName "kube-api-access-rlgjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.958434 4783 generic.go:334] "Generic (PLEG): container finished" podID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerID="0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9" exitCode=0 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.958555 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerDied","Data":"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.958601 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4h5q" event={"ID":"a0fca12d-3abf-4543-b5f7-205f4bd75149","Type":"ContainerDied","Data":"d86ce69f65e312dcc15bef8fcda51e894c5d368c5e1e65986a4fa51d4e51b238"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.959069 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4h5q" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.966386 4783 generic.go:334] "Generic (PLEG): container finished" podID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerID="2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c" exitCode=0 Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.966450 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerDied","Data":"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.966498 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzqtt" event={"ID":"108dfc0b-86ff-45c1-8d9f-a879d585ddff","Type":"ContainerDied","Data":"1e0c861067154e0d5e2d04c3ccc3970c55807992ae7cec4e8d5c08217b4bf84b"} Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.966593 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzqtt" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.976136 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0fca12d-3abf-4543-b5f7-205f4bd75149" (UID: "a0fca12d-3abf-4543-b5f7-205f4bd75149"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:04 crc kubenswrapper[4783]: I0131 09:08:04.983218 4783 scope.go:117] "RemoveContainer" containerID="e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.002250 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" (UID: "67f8d4b0-8393-4fbc-bb6d-f7d321645e9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.002889 4783 scope.go:117] "RemoveContainer" containerID="913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.015785 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" (UID: "58cd7170-ffdf-4cf8-9d2c-4e0251ada36e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.019036 4783 scope.go:117] "RemoveContainer" containerID="92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.019487 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67\": container with ID starting with 92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67 not found: ID does not exist" containerID="92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.019610 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67"} err="failed to get container status \"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67\": rpc error: code = NotFound desc = could not find container \"92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67\": container with ID starting with 92b612bb9113e4400c98300d96cc50325f00925d7f5a7e7ad73dadedc46f8f67 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.019716 4783 scope.go:117] "RemoveContainer" containerID="e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.020135 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274\": container with ID starting with e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274 not found: ID does not exist" containerID="e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.020198 
4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274"} err="failed to get container status \"e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274\": rpc error: code = NotFound desc = could not find container \"e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274\": container with ID starting with e821ffdcc0ee0dcbcf184db0acbc57eeeb58c083c3ff161fd240dfbd71c2e274 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.020234 4783 scope.go:117] "RemoveContainer" containerID="913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.020716 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5\": container with ID starting with 913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5 not found: ID does not exist" containerID="913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.020751 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5"} err="failed to get container status \"913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5\": rpc error: code = NotFound desc = could not find container \"913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5\": container with ID starting with 913b3b3602379ff062311a2c4275b0559d5d9ee04d03e80b0b071cd964757fd5 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.020780 4783 scope.go:117] "RemoveContainer" containerID="7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 
09:08:05.039491 4783 scope.go:117] "RemoveContainer" containerID="7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.039795 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543\": container with ID starting with 7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543 not found: ID does not exist" containerID="7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.039832 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543"} err="failed to get container status \"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543\": rpc error: code = NotFound desc = could not find container \"7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543\": container with ID starting with 7e22bbd0ae25dff0c4c1ae3463e4f50a7aeba0f65fd5e78b2a8a09658f0dc543 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.039856 4783 scope.go:117] "RemoveContainer" containerID="772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046574 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc9fk\" (UniqueName: \"kubernetes.io/projected/108dfc0b-86ff-45c1-8d9f-a879d585ddff-kube-api-access-cc9fk\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046686 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046756 4783 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4std2\" (UniqueName: \"kubernetes.io/projected/7cf20dc6-2184-41b1-a943-f917dafb36b4-kube-api-access-4std2\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046834 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4cf4\" (UniqueName: \"kubernetes.io/projected/a0fca12d-3abf-4543-b5f7-205f4bd75149-kube-api-access-l4cf4\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046887 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046940 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.046996 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047053 4783 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047105 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9t6g\" (UniqueName: \"kubernetes.io/projected/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e-kube-api-access-p9t6g\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047190 4783 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rlgjm\" (UniqueName: \"kubernetes.io/projected/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e-kube-api-access-rlgjm\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047253 4783 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cf20dc6-2184-41b1-a943-f917dafb36b4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047358 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047445 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0fca12d-3abf-4543-b5f7-205f4bd75149-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.047532 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.052971 4783 scope.go:117] "RemoveContainer" containerID="3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.073436 4783 scope.go:117] "RemoveContainer" containerID="50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.079953 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "108dfc0b-86ff-45c1-8d9f-a879d585ddff" (UID: "108dfc0b-86ff-45c1-8d9f-a879d585ddff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.089210 4783 scope.go:117] "RemoveContainer" containerID="772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.089895 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631\": container with ID starting with 772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631 not found: ID does not exist" containerID="772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.089950 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631"} err="failed to get container status \"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631\": rpc error: code = NotFound desc = could not find container \"772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631\": container with ID starting with 772da3b44ce02bfd6d368e6277bd14f3e0d178f1e6ad5fa25541fe5f3c873631 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.090008 4783 scope.go:117] "RemoveContainer" containerID="3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.090903 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11\": container with ID starting with 3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11 not found: ID does not exist" containerID="3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.091044 
4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11"} err="failed to get container status \"3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11\": rpc error: code = NotFound desc = could not find container \"3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11\": container with ID starting with 3f2b0714331effaade16b4994075e99113926a1c5fbcfb664f01b689a49d3c11 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.091153 4783 scope.go:117] "RemoveContainer" containerID="50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.091655 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95\": container with ID starting with 50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95 not found: ID does not exist" containerID="50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.091690 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95"} err="failed to get container status \"50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95\": rpc error: code = NotFound desc = could not find container \"50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95\": container with ID starting with 50d495110117ea5b3777dfc01ba245a030f62ffb6f0c39fba98e19f16932bc95 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.091719 4783 scope.go:117] "RemoveContainer" containerID="0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 
09:08:05.109447 4783 scope.go:117] "RemoveContainer" containerID="67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.124034 4783 scope.go:117] "RemoveContainer" containerID="f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.128468 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-729tq"] Jan 31 09:08:05 crc kubenswrapper[4783]: W0131 09:08:05.139308 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7fa19e_aa64_4479_805e_62625ccc19b8.slice/crio-0862cc8c09ff9cdef83bc2c835830066dad73392a739d32bf91e67162e29e267 WatchSource:0}: Error finding container 0862cc8c09ff9cdef83bc2c835830066dad73392a739d32bf91e67162e29e267: Status 404 returned error can't find the container with id 0862cc8c09ff9cdef83bc2c835830066dad73392a739d32bf91e67162e29e267 Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.142767 4783 scope.go:117] "RemoveContainer" containerID="0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.143147 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9\": container with ID starting with 0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9 not found: ID does not exist" containerID="0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143205 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9"} err="failed to get container status 
\"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9\": rpc error: code = NotFound desc = could not find container \"0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9\": container with ID starting with 0e67d147a8d893d6997fa01b19f3b8bf076ac990be50d060cb33ad37612bc2d9 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143247 4783 scope.go:117] "RemoveContainer" containerID="67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.143528 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe\": container with ID starting with 67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe not found: ID does not exist" containerID="67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143559 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe"} err="failed to get container status \"67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe\": rpc error: code = NotFound desc = could not find container \"67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe\": container with ID starting with 67ba78abfa334d1feee4bc9ce704cb9df086f133b6e917c39546d094318bbafe not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143582 4783 scope.go:117] "RemoveContainer" containerID="f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.143804 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec\": container with ID starting with f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec not found: ID does not exist" containerID="f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143829 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec"} err="failed to get container status \"f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec\": rpc error: code = NotFound desc = could not find container \"f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec\": container with ID starting with f819f97d674615c7a8d6abacb2cf72bae2e73c6aac7ca53d1e9bbba4a52793ec not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.143843 4783 scope.go:117] "RemoveContainer" containerID="2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.147960 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/108dfc0b-86ff-45c1-8d9f-a879d585ddff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.159570 4783 scope.go:117] "RemoveContainer" containerID="ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.174647 4783 scope.go:117] "RemoveContainer" containerID="d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.205182 4783 scope.go:117] "RemoveContainer" containerID="2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.205840 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c\": container with ID starting with 2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c not found: ID does not exist" containerID="2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.205871 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c"} err="failed to get container status \"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c\": rpc error: code = NotFound desc = could not find container \"2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c\": container with ID starting with 2ced1fefb79907b94a28218e799f02c08457fa9a73f9a38bc290d04b0b61692c not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.205890 4783 scope.go:117] "RemoveContainer" containerID="ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.206142 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935\": container with ID starting with ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935 not found: ID does not exist" containerID="ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.206203 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935"} err="failed to get container status \"ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935\": rpc error: code = NotFound desc = could not find container 
\"ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935\": container with ID starting with ee6413b350a1188d6f68ee9e4bbf683b6cffc2f3f4aa40ce79cbcd275db9f935 not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.206224 4783 scope.go:117] "RemoveContainer" containerID="d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f" Jan 31 09:08:05 crc kubenswrapper[4783]: E0131 09:08:05.206552 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f\": container with ID starting with d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f not found: ID does not exist" containerID="d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.206575 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f"} err="failed to get container status \"d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f\": rpc error: code = NotFound desc = could not find container \"d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f\": container with ID starting with d204017e839b2f3304e0f312ab2298d72d587a5bfdb5d255e77cc8fcdcbabd0f not found: ID does not exist" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.270518 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.277893 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xxtnp"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.284664 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:08:05 crc 
kubenswrapper[4783]: I0131 09:08:05.287494 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lkj9z"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.292659 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.303553 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4h5q"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.311244 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.320212 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2whxk"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.321531 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.323830 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zzqtt"] Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.657146 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" path="/var/lib/kubelet/pods/108dfc0b-86ff-45c1-8d9f-a879d585ddff/volumes" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.657881 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" path="/var/lib/kubelet/pods/58cd7170-ffdf-4cf8-9d2c-4e0251ada36e/volumes" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.658495 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" path="/var/lib/kubelet/pods/67f8d4b0-8393-4fbc-bb6d-f7d321645e9e/volumes" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 
09:08:05.659575 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" path="/var/lib/kubelet/pods/7cf20dc6-2184-41b1-a943-f917dafb36b4/volumes" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.660007 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" path="/var/lib/kubelet/pods/a0fca12d-3abf-4543-b5f7-205f4bd75149/volumes" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.979217 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" event={"ID":"6e7fa19e-aa64-4479-805e-62625ccc19b8","Type":"ContainerStarted","Data":"e607c8635c5dd717ad1494910fa23df553867e7c78b929f077a9828f4f0d7133"} Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.979291 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" event={"ID":"6e7fa19e-aa64-4479-805e-62625ccc19b8","Type":"ContainerStarted","Data":"0862cc8c09ff9cdef83bc2c835830066dad73392a739d32bf91e67162e29e267"} Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.979542 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:05 crc kubenswrapper[4783]: I0131 09:08:05.985282 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.001793 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" podStartSLOduration=2.001774891 podStartE2EDuration="2.001774891s" podCreationTimestamp="2026-01-31 09:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:08:05.994814719 
+0000 UTC m=+196.663498187" watchObservedRunningTime="2026-01-31 09:08:06.001774891 +0000 UTC m=+196.670458358" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.602803 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xm4z9"] Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603092 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603107 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603120 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603126 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603140 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603147 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603155 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603176 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603184 4783 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603190 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603198 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603207 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603216 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603221 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603238 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603243 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603251 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603257 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603268 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603275 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603283 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603290 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603300 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603306 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="extract-utilities" Jan 31 09:08:06 crc kubenswrapper[4783]: E0131 09:08:06.603315 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603321 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="extract-content" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603449 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0fca12d-3abf-4543-b5f7-205f4bd75149" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603459 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cd7170-ffdf-4cf8-9d2c-4e0251ada36e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603470 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="108dfc0b-86ff-45c1-8d9f-a879d585ddff" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603478 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f8d4b0-8393-4fbc-bb6d-f7d321645e9e" containerName="registry-server" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.603488 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf20dc6-2184-41b1-a943-f917dafb36b4" containerName="marketplace-operator" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.604389 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.606715 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.617341 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm4z9"] Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.765228 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-catalog-content\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.765369 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-utilities\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.765519 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kf9f\" (UniqueName: \"kubernetes.io/projected/c7cddcda-dcff-4c7a-b437-2bd3750b9200-kube-api-access-6kf9f\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.800350 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-729jp"] Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.801878 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.805191 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.816820 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-729jp"] Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867101 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7h6q\" (UniqueName: \"kubernetes.io/projected/2b509e05-1b13-486b-8986-6a343c3110b8-kube-api-access-t7h6q\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867279 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-catalog-content\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867362 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6kf9f\" (UniqueName: \"kubernetes.io/projected/c7cddcda-dcff-4c7a-b437-2bd3750b9200-kube-api-access-6kf9f\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867538 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-catalog-content\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867587 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-utilities\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.867635 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-utilities\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.868151 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-catalog-content\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.868803 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7cddcda-dcff-4c7a-b437-2bd3750b9200-utilities\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.886466 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kf9f\" (UniqueName: \"kubernetes.io/projected/c7cddcda-dcff-4c7a-b437-2bd3750b9200-kube-api-access-6kf9f\") pod \"redhat-marketplace-xm4z9\" (UID: \"c7cddcda-dcff-4c7a-b437-2bd3750b9200\") " pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.922820 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.969005 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7h6q\" (UniqueName: \"kubernetes.io/projected/2b509e05-1b13-486b-8986-6a343c3110b8-kube-api-access-t7h6q\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.969383 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-catalog-content\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.969465 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-utilities\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " 
pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.969924 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-catalog-content\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.970092 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b509e05-1b13-486b-8986-6a343c3110b8-utilities\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:06 crc kubenswrapper[4783]: I0131 09:08:06.985866 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7h6q\" (UniqueName: \"kubernetes.io/projected/2b509e05-1b13-486b-8986-6a343c3110b8-kube-api-access-t7h6q\") pod \"community-operators-729jp\" (UID: \"2b509e05-1b13-486b-8986-6a343c3110b8\") " pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.118259 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.313273 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm4z9"] Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.502066 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-729jp"] Jan 31 09:08:07 crc kubenswrapper[4783]: W0131 09:08:07.525782 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b509e05_1b13_486b_8986_6a343c3110b8.slice/crio-1067cac758ca4edf32a89503a3f960c7937fd444c1e02da9f15b587d72ce8a72 WatchSource:0}: Error finding container 1067cac758ca4edf32a89503a3f960c7937fd444c1e02da9f15b587d72ce8a72: Status 404 returned error can't find the container with id 1067cac758ca4edf32a89503a3f960c7937fd444c1e02da9f15b587d72ce8a72 Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.999196 4783 generic.go:334] "Generic (PLEG): container finished" podID="c7cddcda-dcff-4c7a-b437-2bd3750b9200" containerID="8639710f360f286b88777fd6ebe41f6aec78334e631f34658cc6b5316a52b84c" exitCode=0 Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.999354 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm4z9" event={"ID":"c7cddcda-dcff-4c7a-b437-2bd3750b9200","Type":"ContainerDied","Data":"8639710f360f286b88777fd6ebe41f6aec78334e631f34658cc6b5316a52b84c"} Jan 31 09:08:07 crc kubenswrapper[4783]: I0131 09:08:07.999513 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm4z9" event={"ID":"c7cddcda-dcff-4c7a-b437-2bd3750b9200","Type":"ContainerStarted","Data":"eec61141ac19e439e55f0e7d6ae155b04276522907f3c426bc3bc7abcc492c3f"} Jan 31 09:08:08 crc kubenswrapper[4783]: I0131 09:08:08.002693 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="2b509e05-1b13-486b-8986-6a343c3110b8" containerID="8a60cb743bd5a7772605dea8df2c697e743795fca766e0a34a88c9e369700a89" exitCode=0 Jan 31 09:08:08 crc kubenswrapper[4783]: I0131 09:08:08.002751 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729jp" event={"ID":"2b509e05-1b13-486b-8986-6a343c3110b8","Type":"ContainerDied","Data":"8a60cb743bd5a7772605dea8df2c697e743795fca766e0a34a88c9e369700a89"} Jan 31 09:08:08 crc kubenswrapper[4783]: I0131 09:08:08.002775 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729jp" event={"ID":"2b509e05-1b13-486b-8986-6a343c3110b8","Type":"ContainerStarted","Data":"1067cac758ca4edf32a89503a3f960c7937fd444c1e02da9f15b587d72ce8a72"} Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.002144 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5dqb2"] Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.003365 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.004951 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.008959 4783 generic.go:334] "Generic (PLEG): container finished" podID="2b509e05-1b13-486b-8986-6a343c3110b8" containerID="36d933b973b61c00e0e68ace50ec39a883120bfd427b7e8dedfe2eb7afc153f5" exitCode=0 Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.009006 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729jp" event={"ID":"2b509e05-1b13-486b-8986-6a343c3110b8","Type":"ContainerDied","Data":"36d933b973b61c00e0e68ace50ec39a883120bfd427b7e8dedfe2eb7afc153f5"} Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.010574 4783 generic.go:334] "Generic (PLEG): container finished" podID="c7cddcda-dcff-4c7a-b437-2bd3750b9200" containerID="1309b72f68fe80a8fb4d96cc24dc45c5facf3c46848b1bcf00bc359729b1afc9" exitCode=0 Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.010611 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm4z9" event={"ID":"c7cddcda-dcff-4c7a-b437-2bd3750b9200","Type":"ContainerDied","Data":"1309b72f68fe80a8fb4d96cc24dc45c5facf3c46848b1bcf00bc359729b1afc9"} Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.016035 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dqb2"] Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.096029 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-utilities\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " 
pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.096072 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftz7\" (UniqueName: \"kubernetes.io/projected/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-kube-api-access-4ftz7\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.096151 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-catalog-content\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.197565 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-utilities\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.197631 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftz7\" (UniqueName: \"kubernetes.io/projected/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-kube-api-access-4ftz7\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.197696 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-catalog-content\") pod \"certified-operators-5dqb2\" (UID: 
\"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.198271 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-catalog-content\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.198526 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-utilities\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.199392 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4fjj8"] Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.200668 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.203666 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.208878 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fjj8"] Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.221570 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftz7\" (UniqueName: \"kubernetes.io/projected/fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2-kube-api-access-4ftz7\") pod \"certified-operators-5dqb2\" (UID: \"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2\") " pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.299013 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-catalog-content\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.299084 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-utilities\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.299211 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6vn\" (UniqueName: \"kubernetes.io/projected/86ccd026-7c4f-4a84-8baa-45cfafa1abba-kube-api-access-lp6vn\") pod \"redhat-operators-4fjj8\" (UID: 
\"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.346777 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.400637 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-utilities\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.400927 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6vn\" (UniqueName: \"kubernetes.io/projected/86ccd026-7c4f-4a84-8baa-45cfafa1abba-kube-api-access-lp6vn\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.400994 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-catalog-content\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.401139 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-utilities\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.401482 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/86ccd026-7c4f-4a84-8baa-45cfafa1abba-catalog-content\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.420946 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6vn\" (UniqueName: \"kubernetes.io/projected/86ccd026-7c4f-4a84-8baa-45cfafa1abba-kube-api-access-lp6vn\") pod \"redhat-operators-4fjj8\" (UID: \"86ccd026-7c4f-4a84-8baa-45cfafa1abba\") " pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.523874 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.707845 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fjj8"] Jan 31 09:08:09 crc kubenswrapper[4783]: W0131 09:08:09.721182 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ccd026_7c4f_4a84_8baa_45cfafa1abba.slice/crio-186a79a372dc725dc55464404bb917537c3ccc720fd7ea1dc174073c7254336a WatchSource:0}: Error finding container 186a79a372dc725dc55464404bb917537c3ccc720fd7ea1dc174073c7254336a: Status 404 returned error can't find the container with id 186a79a372dc725dc55464404bb917537c3ccc720fd7ea1dc174073c7254336a Jan 31 09:08:09 crc kubenswrapper[4783]: I0131 09:08:09.724477 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dqb2"] Jan 31 09:08:09 crc kubenswrapper[4783]: W0131 09:08:09.736253 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe131d5f_6dbb_43d5_8ff0_63b8d9e901a2.slice/crio-ec4a787383a67a93fd085cb08abef41bee3c8fe68279e389ece1a28070799361 WatchSource:0}: Error finding container ec4a787383a67a93fd085cb08abef41bee3c8fe68279e389ece1a28070799361: Status 404 returned error can't find the container with id ec4a787383a67a93fd085cb08abef41bee3c8fe68279e389ece1a28070799361 Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.019566 4783 generic.go:334] "Generic (PLEG): container finished" podID="86ccd026-7c4f-4a84-8baa-45cfafa1abba" containerID="034ca8571fc0bc757589c354a460eb5fca16080d123da399f6a0ae63152ce012" exitCode=0 Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.019665 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjj8" event={"ID":"86ccd026-7c4f-4a84-8baa-45cfafa1abba","Type":"ContainerDied","Data":"034ca8571fc0bc757589c354a460eb5fca16080d123da399f6a0ae63152ce012"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.020046 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjj8" event={"ID":"86ccd026-7c4f-4a84-8baa-45cfafa1abba","Type":"ContainerStarted","Data":"186a79a372dc725dc55464404bb917537c3ccc720fd7ea1dc174073c7254336a"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.023297 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm4z9" event={"ID":"c7cddcda-dcff-4c7a-b437-2bd3750b9200","Type":"ContainerStarted","Data":"80c3072d4a8fc7f8e315d14e8e4c3c5d66695dfd7d03e7f98f73024dc6c090ff"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.025771 4783 generic.go:334] "Generic (PLEG): container finished" podID="fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2" containerID="f330c3560c4e203f517273b887e3342b55f3197b9a39447ad6ad9a93be190288" exitCode=0 Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.025852 4783 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-5dqb2" event={"ID":"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2","Type":"ContainerDied","Data":"f330c3560c4e203f517273b887e3342b55f3197b9a39447ad6ad9a93be190288"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.025885 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqb2" event={"ID":"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2","Type":"ContainerStarted","Data":"ec4a787383a67a93fd085cb08abef41bee3c8fe68279e389ece1a28070799361"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.029050 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-729jp" event={"ID":"2b509e05-1b13-486b-8986-6a343c3110b8","Type":"ContainerStarted","Data":"b83d79dbd41bd2dd4066d0ca3fdc77241be5b3eb4a56e76e11e261991622bf26"} Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.088309 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xm4z9" podStartSLOduration=2.346321369 podStartE2EDuration="4.088282112s" podCreationTimestamp="2026-01-31 09:08:06 +0000 UTC" firstStartedPulling="2026-01-31 09:08:08.000710172 +0000 UTC m=+198.669393650" lastFinishedPulling="2026-01-31 09:08:09.742670925 +0000 UTC m=+200.411354393" observedRunningTime="2026-01-31 09:08:10.086717528 +0000 UTC m=+200.755400996" watchObservedRunningTime="2026-01-31 09:08:10.088282112 +0000 UTC m=+200.756965580" Jan 31 09:08:10 crc kubenswrapper[4783]: I0131 09:08:10.103279 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-729jp" podStartSLOduration=2.548920383 podStartE2EDuration="4.103253455s" podCreationTimestamp="2026-01-31 09:08:06 +0000 UTC" firstStartedPulling="2026-01-31 09:08:08.004008449 +0000 UTC m=+198.672691918" lastFinishedPulling="2026-01-31 09:08:09.558341522 +0000 UTC m=+200.227024990" observedRunningTime="2026-01-31 
09:08:10.100449198 +0000 UTC m=+200.769132666" watchObservedRunningTime="2026-01-31 09:08:10.103253455 +0000 UTC m=+200.771936923" Jan 31 09:08:11 crc kubenswrapper[4783]: I0131 09:08:11.044825 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjj8" event={"ID":"86ccd026-7c4f-4a84-8baa-45cfafa1abba","Type":"ContainerStarted","Data":"8878ded579f4d739c753ecdb8c2a26cce6f29f3806c4160165aee8956b3b13b7"} Jan 31 09:08:11 crc kubenswrapper[4783]: I0131 09:08:11.048677 4783 generic.go:334] "Generic (PLEG): container finished" podID="fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2" containerID="a95163db2b72d13604db2372e4dcbd1541bdf019c51af1bd9fda0d355ac6a687" exitCode=0 Jan 31 09:08:11 crc kubenswrapper[4783]: I0131 09:08:11.048726 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqb2" event={"ID":"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2","Type":"ContainerDied","Data":"a95163db2b72d13604db2372e4dcbd1541bdf019c51af1bd9fda0d355ac6a687"} Jan 31 09:08:12 crc kubenswrapper[4783]: I0131 09:08:12.056244 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dqb2" event={"ID":"fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2","Type":"ContainerStarted","Data":"7e82555f30bbe7ca976ed5df0db94e450c0138b7b609fdb48eec1fb69c2eb3e1"} Jan 31 09:08:12 crc kubenswrapper[4783]: I0131 09:08:12.058415 4783 generic.go:334] "Generic (PLEG): container finished" podID="86ccd026-7c4f-4a84-8baa-45cfafa1abba" containerID="8878ded579f4d739c753ecdb8c2a26cce6f29f3806c4160165aee8956b3b13b7" exitCode=0 Jan 31 09:08:12 crc kubenswrapper[4783]: I0131 09:08:12.058451 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjj8" event={"ID":"86ccd026-7c4f-4a84-8baa-45cfafa1abba","Type":"ContainerDied","Data":"8878ded579f4d739c753ecdb8c2a26cce6f29f3806c4160165aee8956b3b13b7"} Jan 31 09:08:12 crc kubenswrapper[4783]: I0131 09:08:12.087158 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5dqb2" podStartSLOduration=2.470813133 podStartE2EDuration="4.08713792s" podCreationTimestamp="2026-01-31 09:08:08 +0000 UTC" firstStartedPulling="2026-01-31 09:08:10.027336066 +0000 UTC m=+200.696019534" lastFinishedPulling="2026-01-31 09:08:11.643660852 +0000 UTC m=+202.312344321" observedRunningTime="2026-01-31 09:08:12.073082921 +0000 UTC m=+202.741766389" watchObservedRunningTime="2026-01-31 09:08:12.08713792 +0000 UTC m=+202.755821388" Jan 31 09:08:14 crc kubenswrapper[4783]: I0131 09:08:14.072751 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjj8" event={"ID":"86ccd026-7c4f-4a84-8baa-45cfafa1abba","Type":"ContainerStarted","Data":"c8d74a9a1693173387d334000455f5ff4ab5cbace9a85d62d2a3911bd77096e6"} Jan 31 09:08:14 crc kubenswrapper[4783]: I0131 09:08:14.087439 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4fjj8" podStartSLOduration=2.561351585 podStartE2EDuration="5.087427505s" podCreationTimestamp="2026-01-31 09:08:09 +0000 UTC" firstStartedPulling="2026-01-31 09:08:10.021362767 +0000 UTC m=+200.690046234" lastFinishedPulling="2026-01-31 09:08:12.547438676 +0000 UTC m=+203.216122154" observedRunningTime="2026-01-31 09:08:14.086383 +0000 UTC m=+204.755066468" watchObservedRunningTime="2026-01-31 09:08:14.087427505 +0000 UTC m=+204.756110973" Jan 31 09:08:16 crc kubenswrapper[4783]: I0131 09:08:16.923433 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:16 crc kubenswrapper[4783]: I0131 09:08:16.923819 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:16 crc kubenswrapper[4783]: I0131 09:08:16.961407 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.118948 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.118999 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.120518 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xm4z9" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.158621 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.756668 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.756769 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.756866 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.757878 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:08:17 crc kubenswrapper[4783]: I0131 09:08:17.757926 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c" gracePeriod=600 Jan 31 09:08:18 crc kubenswrapper[4783]: I0131 09:08:18.098119 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c" exitCode=0 Jan 31 09:08:18 crc kubenswrapper[4783]: I0131 09:08:18.098211 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c"} Jan 31 09:08:18 crc kubenswrapper[4783]: I0131 09:08:18.098547 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a"} Jan 31 09:08:18 crc kubenswrapper[4783]: I0131 09:08:18.132908 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-729jp" Jan 31 09:08:19 crc kubenswrapper[4783]: I0131 09:08:19.347374 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:19 crc 
kubenswrapper[4783]: I0131 09:08:19.347895 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:19 crc kubenswrapper[4783]: I0131 09:08:19.398779 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:19 crc kubenswrapper[4783]: I0131 09:08:19.524474 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:19 crc kubenswrapper[4783]: I0131 09:08:19.524734 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:19 crc kubenswrapper[4783]: I0131 09:08:19.562901 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.143566 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5dqb2" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.143633 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4fjj8" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.330977 4783 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.331990 4783 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332204 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332361 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357" gracePeriod=15 Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332405 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406" gracePeriod=15 Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332442 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7" gracePeriod=15 Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332510 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410" gracePeriod=15 Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.332558 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f" gracePeriod=15 Jan 31 09:08:20 crc 
kubenswrapper[4783]: I0131 09:08:20.333732 4783 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.333928 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.333943 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.333964 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.333970 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.333983 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.333989 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.334000 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334008 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.334016 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334022 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.334031 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334036 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334139 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334151 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334179 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334188 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334197 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334206 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: E0131 09:08:20.334355 4783 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.334363 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.363129 4783 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]log ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]api-openshift-apiserver-available ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]api-openshift-oauth-apiserver-available ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]informer-sync ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/priority-and-fairness-filter ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-apiextensions-informers ok Jan 31 09:08:20 crc kubenswrapper[4783]: 
[+]poststarthook/start-apiextensions-controllers ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/crd-informer-synced ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-system-namespaces-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/rbac/bootstrap-roles ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/bootstrap-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/start-kube-aggregator-informers ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-registration-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-discovery-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]autoregister-completion ok Jan 31 09:08:20 crc 
kubenswrapper[4783]: [+]poststarthook/apiservice-openapi-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 31 09:08:20 crc kubenswrapper[4783]: [-]shutdown failed: reason withheld Jan 31 09:08:20 crc kubenswrapper[4783]: readyz check failed Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.363200 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461691 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461742 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461763 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461782 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461798 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461840 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.461960 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.462048 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.562996 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563048 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563076 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563097 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563112 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563155 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563191 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563210 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563290 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563343 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563365 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563386 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563407 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563425 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563444 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:20 crc kubenswrapper[4783]: I0131 09:08:20.563465 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.115925 4783 generic.go:334] "Generic (PLEG): container finished" podID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" containerID="0b4a55938dbc01bc8c33bc7b0ab07099028fd4199c59f764d0763efec34e7a17" exitCode=0 Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.116034 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ecd2b3c-2ff6-4a90-b525-e262382bd09f","Type":"ContainerDied","Data":"0b4a55938dbc01bc8c33bc7b0ab07099028fd4199c59f764d0763efec34e7a17"} Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.116936 4783 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.117372 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.119532 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.120761 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.121463 4783 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f" exitCode=0 Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.121492 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406" exitCode=0 Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.121503 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7" exitCode=0 Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.121513 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410" exitCode=2 Jan 31 09:08:21 crc kubenswrapper[4783]: I0131 09:08:21.121712 4783 scope.go:117] "RemoveContainer" containerID="4661d603005fd44f6c113cf95eec92500c578795e9a7d755312dc9ffbc1cb69b" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.132643 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.358469 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.359131 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.385132 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock\") pod \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.385185 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access\") pod \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.385243 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir\") pod \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\" (UID: \"4ecd2b3c-2ff6-4a90-b525-e262382bd09f\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.385450 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ecd2b3c-2ff6-4a90-b525-e262382bd09f" (UID: "4ecd2b3c-2ff6-4a90-b525-e262382bd09f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.386278 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock" (OuterVolumeSpecName: "var-lock") pod "4ecd2b3c-2ff6-4a90-b525-e262382bd09f" (UID: "4ecd2b3c-2ff6-4a90-b525-e262382bd09f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.393207 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ecd2b3c-2ff6-4a90-b525-e262382bd09f" (UID: "4ecd2b3c-2ff6-4a90-b525-e262382bd09f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.486440 4783 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.486957 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.487021 4783 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ecd2b3c-2ff6-4a90-b525-e262382bd09f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.691852 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 
09:08:22.692573 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.693013 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.693213 4783 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.891619 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.891727 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892015 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892095 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892239 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892238 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892765 4783 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892846 4783 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:22 crc kubenswrapper[4783]: I0131 09:08:22.892908 4783 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.143235 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.143893 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357" exitCode=0 Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.143974 4783 scope.go:117] "RemoveContainer" containerID="fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.144094 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.148365 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ecd2b3c-2ff6-4a90-b525-e262382bd09f","Type":"ContainerDied","Data":"56f865689e232d3f110b0bb7a2b0ae8f888f8e476c1b5c1557c5531e81ec11d3"} Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.148401 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.148441 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f865689e232d3f110b0bb7a2b0ae8f888f8e476c1b5c1557c5531e81ec11d3" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.156696 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.157181 4783 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.160434 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: 
I0131 09:08:23.160658 4783 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.161377 4783 scope.go:117] "RemoveContainer" containerID="f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.172481 4783 scope.go:117] "RemoveContainer" containerID="c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.189837 4783 scope.go:117] "RemoveContainer" containerID="53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.203487 4783 scope.go:117] "RemoveContainer" containerID="8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.215645 4783 scope.go:117] "RemoveContainer" containerID="c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.234486 4783 scope.go:117] "RemoveContainer" containerID="fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.234841 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\": container with ID starting with fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f not found: ID does not exist" containerID="fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.234879 4783 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f"} err="failed to get container status \"fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\": rpc error: code = NotFound desc = could not find container \"fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f\": container with ID starting with fd634117e442eebc37d960962dbc57c319fd0a3e2ede9057caf210d4e1cf961f not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.234907 4783 scope.go:117] "RemoveContainer" containerID="f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.235467 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\": container with ID starting with f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406 not found: ID does not exist" containerID="f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.235527 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406"} err="failed to get container status \"f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\": rpc error: code = NotFound desc = could not find container \"f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406\": container with ID starting with f1cded24f4a64408514bee7175893c3f4c0f85a45328b335bf71312b200d4406 not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.235564 4783 scope.go:117] "RemoveContainer" containerID="c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.235894 4783 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\": container with ID starting with c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7 not found: ID does not exist" containerID="c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.235938 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7"} err="failed to get container status \"c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\": rpc error: code = NotFound desc = could not find container \"c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7\": container with ID starting with c5319b6837d696dbf43d204528c1cbdefc76cfaf6c314ea647af8c80731659a7 not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.235955 4783 scope.go:117] "RemoveContainer" containerID="53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.236354 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\": container with ID starting with 53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410 not found: ID does not exist" containerID="53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.236484 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410"} err="failed to get container status \"53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\": rpc error: code = NotFound desc = could not find container 
\"53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410\": container with ID starting with 53eb63b9a88cf8a538093d039c9e69530c7c73a5000f1b01a0c058a58e58a410 not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.236571 4783 scope.go:117] "RemoveContainer" containerID="8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.236939 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\": container with ID starting with 8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357 not found: ID does not exist" containerID="8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.236973 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357"} err="failed to get container status \"8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\": rpc error: code = NotFound desc = could not find container \"8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357\": container with ID starting with 8f952959f94378a60d005ba272e1d494d5e1c4925cc5a55f2b447a3f406db357 not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.236996 4783 scope.go:117] "RemoveContainer" containerID="c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.237315 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\": container with ID starting with c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef not found: ID does not exist" 
containerID="c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.237350 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef"} err="failed to get container status \"c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\": rpc error: code = NotFound desc = could not find container \"c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef\": container with ID starting with c7909524efa2065e20fb2bd7da323638f1f5e2309baa1afa994e48a43bff6eef not found: ID does not exist" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.457593 4783 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.458276 4783 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.458543 4783 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.458735 4783 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.458916 4783 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.458938 4783 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.459103 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="200ms" Jan 31 09:08:23 crc kubenswrapper[4783]: I0131 09:08:23.652176 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 31 09:08:23 crc kubenswrapper[4783]: E0131 09:08:23.659824 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="400ms" Jan 31 09:08:24 crc kubenswrapper[4783]: E0131 09:08:24.060770 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="800ms" Jan 31 09:08:24 crc kubenswrapper[4783]: E0131 09:08:24.861349 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="1.6s" Jan 31 09:08:25 crc kubenswrapper[4783]: E0131 
09:08:25.390205 4783 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.246:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:25 crc kubenswrapper[4783]: I0131 09:08:25.390631 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:25 crc kubenswrapper[4783]: W0131 09:08:25.408878 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-876d0d2bb659163c368ac7801ce72debe0f7b29182b2515f5066f4904274d8b8 WatchSource:0}: Error finding container 876d0d2bb659163c368ac7801ce72debe0f7b29182b2515f5066f4904274d8b8: Status 404 returned error can't find the container with id 876d0d2bb659163c368ac7801ce72debe0f7b29182b2515f5066f4904274d8b8 Jan 31 09:08:25 crc kubenswrapper[4783]: E0131 09:08:25.411810 4783 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.246:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fc5a85897d107 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 
09:08:25.411416327 +0000 UTC m=+216.080099795,LastTimestamp:2026-01-31 09:08:25.411416327 +0000 UTC m=+216.080099795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:08:26 crc kubenswrapper[4783]: I0131 09:08:26.168289 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"43038965fd9d645f110f292b9a9323c55d75c695ed1312fc49d378a328ed7be5"} Jan 31 09:08:26 crc kubenswrapper[4783]: I0131 09:08:26.169001 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"876d0d2bb659163c368ac7801ce72debe0f7b29182b2515f5066f4904274d8b8"} Jan 31 09:08:26 crc kubenswrapper[4783]: I0131 09:08:26.169624 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:26 crc kubenswrapper[4783]: E0131 09:08:26.169691 4783 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.246:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:08:26 crc kubenswrapper[4783]: E0131 09:08:26.279195 4783 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.246:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fc5a85897d107 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:08:25.411416327 +0000 UTC m=+216.080099795,LastTimestamp:2026-01-31 09:08:25.411416327 +0000 UTC m=+216.080099795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:08:26 crc kubenswrapper[4783]: E0131 09:08:26.462406 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="3.2s" Jan 31 09:08:29 crc kubenswrapper[4783]: I0131 09:08:29.648712 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:29 crc kubenswrapper[4783]: E0131 09:08:29.663247 4783 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.246:6443: connect: connection refused" interval="6.4s" Jan 31 09:08:33 crc 
kubenswrapper[4783]: I0131 09:08:33.645453 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:33 crc kubenswrapper[4783]: I0131 09:08:33.647484 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:33 crc kubenswrapper[4783]: I0131 09:08:33.658517 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:33 crc kubenswrapper[4783]: I0131 09:08:33.658555 4783 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:33 crc kubenswrapper[4783]: E0131 09:08:33.658973 4783 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:33 crc kubenswrapper[4783]: I0131 09:08:33.659338 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.213213 4783 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fa2cbd5d038574d1cd269078d1f977d7c27e0f5b4ca86eb2d4422b9af403a668" exitCode=0 Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.213309 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fa2cbd5d038574d1cd269078d1f977d7c27e0f5b4ca86eb2d4422b9af403a668"} Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.213367 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5483cbb64e4e6df523a57ea831efa544d79c58fa40e9a76aa72baceea4076a60"} Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.213660 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.213686 4783 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.214244 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:34 crc kubenswrapper[4783]: E0131 09:08:34.214436 4783 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 192.168.26.246:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.216732 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.216781 4783 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932" exitCode=1 Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.216815 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932"} Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.217211 4783 scope.go:117] "RemoveContainer" containerID="e95ddefb0ef8ef77446184bcfcef0425695ad661467edecd02fce37255866932" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.217622 4783 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:34 crc kubenswrapper[4783]: I0131 09:08:34.217986 4783 status_manager.go:851] "Failed to get status for pod" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.246:6443: connect: connection refused" Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.227863 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ab6992428e1cc81c91ecfd52da33d94a1873d98e7b3dc44868e1bc73f0d0d6b5"} Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228336 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6298629903c46bc7ce2dcbcdde76ed2b2a0fdef28dee7d68f95ef8a50fda967e"} Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228353 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d98508b37264ed0ea493a8aa9dbd0afe8bad1e445c991a646544b7bcd29ff5b0"} Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228364 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ba3721ccc2ccd605d4662b54ba5cbafcf8fc2828628b294c681b263e8cfb97e3"} Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228373 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3467cb4ff69b27ccfcf3905eb4104aa4cb97b3c9f7949263bbc962942af33068"} Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228641 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228661 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.228681 4783 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.233093 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:08:35 crc kubenswrapper[4783]: I0131 09:08:35.233197 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"854b9e6dbe3a0f8cc1513b3d900f57722454b02e88bf69387abd26cf6bb91984"} Jan 31 09:08:37 crc kubenswrapper[4783]: I0131 09:08:37.250890 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:08:37 crc kubenswrapper[4783]: I0131 09:08:37.254629 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:08:38 crc kubenswrapper[4783]: I0131 09:08:38.249715 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:08:38 crc kubenswrapper[4783]: I0131 09:08:38.659708 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:38 crc kubenswrapper[4783]: I0131 09:08:38.659785 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:38 crc kubenswrapper[4783]: I0131 09:08:38.664473 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:40 crc kubenswrapper[4783]: I0131 09:08:40.695838 4783 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:40 crc kubenswrapper[4783]: I0131 09:08:40.763883 4783 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d797e387-b8fd-4ba2-8547-de356e8c3772" Jan 31 09:08:41 crc kubenswrapper[4783]: I0131 09:08:41.265355 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:41 crc kubenswrapper[4783]: I0131 09:08:41.265400 4783 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:41 crc kubenswrapper[4783]: I0131 09:08:41.268442 4783 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d797e387-b8fd-4ba2-8547-de356e8c3772" Jan 31 09:08:41 crc kubenswrapper[4783]: I0131 09:08:41.269124 4783 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://3467cb4ff69b27ccfcf3905eb4104aa4cb97b3c9f7949263bbc962942af33068" Jan 31 09:08:41 crc kubenswrapper[4783]: I0131 09:08:41.269154 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:08:42 crc kubenswrapper[4783]: I0131 09:08:42.270458 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:42 crc kubenswrapper[4783]: I0131 09:08:42.270728 4783 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:08:42 crc kubenswrapper[4783]: I0131 09:08:42.272866 4783 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d797e387-b8fd-4ba2-8547-de356e8c3772" Jan 31 09:08:50 crc kubenswrapper[4783]: I0131 09:08:50.372899 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 09:08:50 crc kubenswrapper[4783]: I0131 09:08:50.572956 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:08:50 crc kubenswrapper[4783]: I0131 09:08:50.972894 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.129023 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.205716 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.294613 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.668120 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.950465 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 09:08:51 crc kubenswrapper[4783]: I0131 09:08:51.989322 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.193080 4783 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.471003 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.608785 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.639516 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.805847 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 09:08:52 crc kubenswrapper[4783]: I0131 09:08:52.938083 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.153137 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.163327 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.301906 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.447099 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.486920 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 09:08:53 crc 
kubenswrapper[4783]: I0131 09:08:53.723335 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.894434 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 09:08:53 crc kubenswrapper[4783]: I0131 09:08:53.919994 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.013841 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.068705 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.259009 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.272001 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.309670 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.434764 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.435995 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.490226 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 09:08:54 crc 
kubenswrapper[4783]: I0131 09:08:54.529753 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.659199 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.678933 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.721797 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 09:08:54 crc kubenswrapper[4783]: I0131 09:08:54.918696 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.048039 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.067866 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.096258 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.116540 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.243647 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.263013 4783 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.276320 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.329382 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.340012 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.355085 4783 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.496551 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.503893 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.552296 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.556329 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.565081 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.567986 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 
09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.827449 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 09:08:55 crc kubenswrapper[4783]: I0131 09:08:55.841926 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.026287 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.056072 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.090727 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.108307 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.202536 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.311427 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.348364 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.532230 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.719782 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.724044 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.768955 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.819778 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 09:08:56 crc kubenswrapper[4783]: I0131 09:08:56.900265 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.053022 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.166840 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.394638 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.404296 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.591702 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.649494 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.671130 4783 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.728264 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.739999 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.761601 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.775843 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.810496 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.848820 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.851671 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 09:08:57 crc kubenswrapper[4783]: I0131 09:08:57.908025 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.024626 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.064960 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:08:58 crc 
kubenswrapper[4783]: I0131 09:08:58.133096 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.141495 4783 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.151642 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.288586 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.307990 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.321192 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.335214 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.416054 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.528616 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.551052 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.554859 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:08:58 crc 
kubenswrapper[4783]: I0131 09:08:58.562644 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.600994 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.616695 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.630894 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.654048 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.712066 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.716428 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.728395 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.828194 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.839937 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.883678 4783 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.896585 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:08:58 crc kubenswrapper[4783]: I0131 09:08:58.939365 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.026848 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.037995 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.077645 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.167650 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.175756 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.203766 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.275364 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.299534 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 
09:08:59.331829 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.404755 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.548088 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.612552 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.643408 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.662664 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.878236 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:08:59 crc kubenswrapper[4783]: I0131 09:08:59.957460 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.012397 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.080035 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.171026 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.193917 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.207896 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.208177 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.243694 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.265936 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.286253 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.296181 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.315211 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.349328 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.369331 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 09:09:00 crc 
kubenswrapper[4783]: I0131 09:09:00.369581 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.372506 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.431798 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.502352 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.506553 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.589211 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.604505 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.678539 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.724121 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.830460 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.935736 4783 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 09:09:00 crc kubenswrapper[4783]: I0131 09:09:00.946022 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.020546 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.025707 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.034738 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.038562 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.114954 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.158814 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.171318 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.209121 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.261057 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.287430 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.309456 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.534285 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.557759 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.651486 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.680437 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.831079 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.832957 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.865998 4783 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.894570 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.918678 4783 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.936284 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.958525 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:09:01 crc kubenswrapper[4783]: I0131 09:09:01.982151 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.017450 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.093886 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.137735 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.181735 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.193665 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.249497 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.384097 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:09:02 crc 
kubenswrapper[4783]: I0131 09:09:02.450213 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.454993 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.468838 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.535857 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.608917 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.792792 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.818005 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.831469 4783 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.835356 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.836355 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.836407 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:09:02 crc 
kubenswrapper[4783]: I0131 09:09:02.836741 4783 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.836777 4783 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d7dcdae0-fa0c-477f-93ca-03afdca81d43" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.839680 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.843058 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.848725 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.848713213 podStartE2EDuration="22.848713213s" podCreationTimestamp="2026-01-31 09:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:09:02.848364717 +0000 UTC m=+253.517048185" watchObservedRunningTime="2026-01-31 09:09:02.848713213 +0000 UTC m=+253.517396680" Jan 31 09:09:02 crc kubenswrapper[4783]: I0131 09:09:02.984629 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.037536 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.049104 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.100699 4783 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.171641 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.357236 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.376104 4783 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.376471 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://43038965fd9d645f110f292b9a9323c55d75c695ed1312fc49d378a328ed7be5" gracePeriod=5 Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.428305 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.465283 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.480939 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.515109 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.529863 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 09:09:03 crc kubenswrapper[4783]: 
I0131 09:09:03.591845 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.692362 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.724687 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.933033 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.956129 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.958988 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:09:03 crc kubenswrapper[4783]: I0131 09:09:03.977646 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.194097 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.355273 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.409368 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.602128 4783 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.693051 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.705902 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.706569 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.776961 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 09:09:04 crc kubenswrapper[4783]: I0131 09:09:04.969822 4783 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.012749 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.052132 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.062652 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.257441 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.312603 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 
09:09:05.347298 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.379443 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.408158 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.443659 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.527093 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.630058 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.707143 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 09:09:05 crc kubenswrapper[4783]: I0131 09:09:05.813618 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.006537 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.010835 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.061588 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.245724 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.256982 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.277492 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.290689 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.373646 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.484614 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.521543 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.543909 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.800555 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.891472 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.921837 4783 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 09:09:06 crc kubenswrapper[4783]: I0131 09:09:06.972703 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.067910 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.093060 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.266350 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.266559 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.544209 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.793636 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 09:09:07 crc kubenswrapper[4783]: I0131 09:09:07.902449 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:09:08 crc kubenswrapper[4783]: I0131 09:09:08.062518 4783 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 09:09:08 crc kubenswrapper[4783]: I0131 09:09:08.156073 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 09:09:08 crc kubenswrapper[4783]: 
I0131 09:09:08.421094 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:09:08 crc kubenswrapper[4783]: I0131 09:09:08.421365 4783 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="43038965fd9d645f110f292b9a9323c55d75c695ed1312fc49d378a328ed7be5" exitCode=137 Jan 31 09:09:08 crc kubenswrapper[4783]: I0131 09:09:08.930904 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:09:08 crc kubenswrapper[4783]: I0131 09:09:08.930992 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129644 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129744 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129753 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129837 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129891 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129935 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129937 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.129991 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.130039 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.130269 4783 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.130282 4783 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.130293 4783 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.130302 4783 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.138015 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.231605 4783 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.428040 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.428350 4783 scope.go:117] "RemoveContainer" containerID="43038965fd9d645f110f292b9a9323c55d75c695ed1312fc49d378a328ed7be5" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.428404 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:09:09 crc kubenswrapper[4783]: I0131 09:09:09.651572 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 31 09:09:36 crc kubenswrapper[4783]: I0131 09:09:36.546633 4783 generic.go:334] "Generic (PLEG): container finished" podID="6e7fa19e-aa64-4479-805e-62625ccc19b8" containerID="e607c8635c5dd717ad1494910fa23df553867e7c78b929f077a9828f4f0d7133" exitCode=0 Jan 31 09:09:36 crc kubenswrapper[4783]: I0131 09:09:36.546737 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" event={"ID":"6e7fa19e-aa64-4479-805e-62625ccc19b8","Type":"ContainerDied","Data":"e607c8635c5dd717ad1494910fa23df553867e7c78b929f077a9828f4f0d7133"} Jan 31 09:09:36 crc kubenswrapper[4783]: I0131 09:09:36.547552 4783 scope.go:117] "RemoveContainer" containerID="e607c8635c5dd717ad1494910fa23df553867e7c78b929f077a9828f4f0d7133" Jan 31 09:09:37 crc 
kubenswrapper[4783]: I0131 09:09:37.554830 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" event={"ID":"6e7fa19e-aa64-4479-805e-62625ccc19b8","Type":"ContainerStarted","Data":"8f0fd6fef1826b265eace4d5bc76811ecb19b9e3741c0f7200e67da52910acb8"} Jan 31 09:09:37 crc kubenswrapper[4783]: I0131 09:09:37.555565 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:09:37 crc kubenswrapper[4783]: I0131 09:09:37.556839 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-729tq" Jan 31 09:09:49 crc kubenswrapper[4783]: I0131 09:09:49.542011 4783 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.293975 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.295095 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" podUID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" containerName="controller-manager" containerID="cri-o://4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6" gracePeriod=30 Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.384928 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.385157 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" podUID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" containerName="route-controller-manager" 
containerID="cri-o://af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb" gracePeriod=30 Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.594244 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661175 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661568 4783 generic.go:334] "Generic (PLEG): container finished" podID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" containerID="4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6" exitCode=0 Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661619 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661662 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" event={"ID":"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d","Type":"ContainerDied","Data":"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6"} Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661696 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-skr4f" event={"ID":"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d","Type":"ContainerDied","Data":"7338079f256bc10b53448708c2288e787324889d66922e9afc308f56c3ac2406"} Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.661741 4783 scope.go:117] "RemoveContainer" containerID="4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.664840 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" containerID="af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb" exitCode=0 Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.664890 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" event={"ID":"175323b1-b0a0-4811-a2c7-4c98ee3a5b56","Type":"ContainerDied","Data":"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb"} Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.664917 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.664938 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db" event={"ID":"175323b1-b0a0-4811-a2c7-4c98ee3a5b56","Type":"ContainerDied","Data":"dae54553c9ef94b565eb5a57a9d5d22ef453c5b971e381a70d401ebfcb44b90d"} Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.682119 4783 scope.go:117] "RemoveContainer" containerID="4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6" Jan 31 09:09:59 crc kubenswrapper[4783]: E0131 09:09:59.682492 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6\": container with ID starting with 4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6 not found: ID does not exist" containerID="4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.682532 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6"} err="failed to get container status 
\"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6\": rpc error: code = NotFound desc = could not find container \"4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6\": container with ID starting with 4210581ab6a01075fcdd091aaec800ede9b181e1e6a5eefe255e34258002feb6 not found: ID does not exist" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.682563 4783 scope.go:117] "RemoveContainer" containerID="af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.694072 4783 scope.go:117] "RemoveContainer" containerID="af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb" Jan 31 09:09:59 crc kubenswrapper[4783]: E0131 09:09:59.694765 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb\": container with ID starting with af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb not found: ID does not exist" containerID="af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.694797 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb"} err="failed to get container status \"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb\": rpc error: code = NotFound desc = could not find container \"af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb\": container with ID starting with af44c122098f051b22603517b178c92898c2c932ef94e112efcff6ba583811fb not found: ID does not exist" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.763989 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnblf\" (UniqueName: 
\"kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf\") pod \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764055 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert\") pod \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764103 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config\") pod \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764132 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca\") pod \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764180 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5m2\" (UniqueName: \"kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2\") pod \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764223 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert\") pod \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764267 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles\") pod \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\" (UID: \"4ba51b72-4b47-44ce-ba01-39cfe7bacf6d\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764310 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca\") pod \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.764335 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config\") pod \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\" (UID: \"175323b1-b0a0-4811-a2c7-4c98ee3a5b56\") " Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.765414 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" (UID: "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.765486 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config" (OuterVolumeSpecName: "config") pod "175323b1-b0a0-4811-a2c7-4c98ee3a5b56" (UID: "175323b1-b0a0-4811-a2c7-4c98ee3a5b56"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.765542 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca" (OuterVolumeSpecName: "client-ca") pod "175323b1-b0a0-4811-a2c7-4c98ee3a5b56" (UID: "175323b1-b0a0-4811-a2c7-4c98ee3a5b56"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.765930 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config" (OuterVolumeSpecName: "config") pod "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" (UID: "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.766298 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" (UID: "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.770991 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf" (OuterVolumeSpecName: "kube-api-access-mnblf") pod "175323b1-b0a0-4811-a2c7-4c98ee3a5b56" (UID: "175323b1-b0a0-4811-a2c7-4c98ee3a5b56"). InnerVolumeSpecName "kube-api-access-mnblf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.771336 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "175323b1-b0a0-4811-a2c7-4c98ee3a5b56" (UID: "175323b1-b0a0-4811-a2c7-4c98ee3a5b56"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.771442 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2" (OuterVolumeSpecName: "kube-api-access-bt5m2") pod "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" (UID: "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d"). InnerVolumeSpecName "kube-api-access-bt5m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.771672 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" (UID: "4ba51b72-4b47-44ce-ba01-39cfe7bacf6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866209 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866234 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866245 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnblf\" (UniqueName: \"kubernetes.io/projected/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-kube-api-access-mnblf\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866267 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175323b1-b0a0-4811-a2c7-4c98ee3a5b56-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866275 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866283 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866292 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5m2\" (UniqueName: \"kubernetes.io/projected/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-kube-api-access-bt5m2\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866300 4783 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:09:59 crc kubenswrapper[4783]: I0131 09:09:59.866308 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:00 crc kubenswrapper[4783]: I0131 09:09:59.996927 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:10:00 crc kubenswrapper[4783]: I0131 09:09:59.999909 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-skr4f"] Jan 31 09:10:00 crc kubenswrapper[4783]: I0131 09:10:00.011778 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:10:00 crc kubenswrapper[4783]: I0131 09:10:00.014244 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9m5db"] Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.011798 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:01 crc kubenswrapper[4783]: E0131 09:10:01.012783 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" containerName="route-controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.012827 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" containerName="route-controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: E0131 09:10:01.012842 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.012854 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:10:01 crc kubenswrapper[4783]: E0131 09:10:01.012862 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" containerName="controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.012871 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" containerName="controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: E0131 09:10:01.012889 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" containerName="installer" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.012897 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" containerName="installer" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.013060 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" containerName="controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.013077 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" containerName="route-controller-manager" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.013087 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.013098 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecd2b3c-2ff6-4a90-b525-e262382bd09f" containerName="installer" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.013771 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.014652 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.015675 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.016122 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.016451 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.017586 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.017828 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.017848 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.018700 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.018971 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.020192 4783 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.020402 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.022661 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.024886 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.026918 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.026941 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.031264 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.031970 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184763 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184817 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184843 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184863 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhntj\" (UniqueName: \"kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184899 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184934 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " 
pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184955 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gbh\" (UniqueName: \"kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184973 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.184990 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.285923 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286289 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhntj\" 
(UniqueName: \"kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286327 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286369 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286393 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gbh\" (UniqueName: \"kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286415 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc 
kubenswrapper[4783]: I0131 09:10:01.286435 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286461 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.286487 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.287921 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.287996 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") 
" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.288241 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.288345 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.288527 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.290980 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.291083 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert\") pod 
\"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.303508 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhntj\" (UniqueName: \"kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj\") pod \"route-controller-manager-6c457667dd-zdc6s\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.304293 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gbh\" (UniqueName: \"kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh\") pod \"controller-manager-5574b8fdb4-s5q4c\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.329427 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.336871 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.653290 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175323b1-b0a0-4811-a2c7-4c98ee3a5b56" path="/var/lib/kubelet/pods/175323b1-b0a0-4811-a2c7-4c98ee3a5b56/volumes" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.654046 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba51b72-4b47-44ce-ba01-39cfe7bacf6d" path="/var/lib/kubelet/pods/4ba51b72-4b47-44ce-ba01-39cfe7bacf6d/volumes" Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.697916 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:01 crc kubenswrapper[4783]: I0131 09:10:01.720727 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:01 crc kubenswrapper[4783]: W0131 09:10:01.724897 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a3881a_9e0a_4549_bd78_1c8ff44882e1.slice/crio-7d310f69d4d41dfb310a1db72bc1f6fd8f2fd1b570192d92846ab99029b772fe WatchSource:0}: Error finding container 7d310f69d4d41dfb310a1db72bc1f6fd8f2fd1b570192d92846ab99029b772fe: Status 404 returned error can't find the container with id 7d310f69d4d41dfb310a1db72bc1f6fd8f2fd1b570192d92846ab99029b772fe Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.685285 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" event={"ID":"3ead2cc5-e399-4a3f-be5e-99199fd476f3","Type":"ContainerStarted","Data":"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24"} Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.685777 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.685792 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" event={"ID":"3ead2cc5-e399-4a3f-be5e-99199fd476f3","Type":"ContainerStarted","Data":"bc2c78be0e97d63b6537a16d975d6e96117dd2858cfa4255ed23db8ceae68d17"} Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.687767 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" event={"ID":"d6a3881a-9e0a-4549-bd78-1c8ff44882e1","Type":"ContainerStarted","Data":"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40"} Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.687817 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" event={"ID":"d6a3881a-9e0a-4549-bd78-1c8ff44882e1","Type":"ContainerStarted","Data":"7d310f69d4d41dfb310a1db72bc1f6fd8f2fd1b570192d92846ab99029b772fe"} Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.687996 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.691605 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.692580 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.704662 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" podStartSLOduration=3.704649807 
podStartE2EDuration="3.704649807s" podCreationTimestamp="2026-01-31 09:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:02.703631718 +0000 UTC m=+313.372315186" watchObservedRunningTime="2026-01-31 09:10:02.704649807 +0000 UTC m=+313.373333274" Jan 31 09:10:02 crc kubenswrapper[4783]: I0131 09:10:02.724424 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" podStartSLOduration=3.724391335 podStartE2EDuration="3.724391335s" podCreationTimestamp="2026-01-31 09:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:02.717054837 +0000 UTC m=+313.385738315" watchObservedRunningTime="2026-01-31 09:10:02.724391335 +0000 UTC m=+313.393074802" Jan 31 09:10:04 crc kubenswrapper[4783]: I0131 09:10:04.155633 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:04 crc kubenswrapper[4783]: I0131 09:10:04.159809 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:05 crc kubenswrapper[4783]: I0131 09:10:05.704107 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" podUID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" containerName="controller-manager" containerID="cri-o://61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24" gracePeriod=30 Jan 31 09:10:05 crc kubenswrapper[4783]: I0131 09:10:05.704274 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" 
podUID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" containerName="route-controller-manager" containerID="cri-o://e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40" gracePeriod=30 Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.075957 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.097037 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 09:10:06 crc kubenswrapper[4783]: E0131 09:10:06.097301 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" containerName="route-controller-manager" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.097320 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" containerName="route-controller-manager" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.097416 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" containerName="route-controller-manager" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.097787 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.111430 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.112502 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.152737 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert\") pod \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.152812 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhntj\" (UniqueName: \"kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj\") pod \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.152870 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca\") pod \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.152966 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config\") pod \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\" (UID: \"d6a3881a-9e0a-4549-bd78-1c8ff44882e1\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.153797 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6a3881a-9e0a-4549-bd78-1c8ff44882e1" (UID: "d6a3881a-9e0a-4549-bd78-1c8ff44882e1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.153989 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config" (OuterVolumeSpecName: "config") pod "d6a3881a-9e0a-4549-bd78-1c8ff44882e1" (UID: "d6a3881a-9e0a-4549-bd78-1c8ff44882e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.159388 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj" (OuterVolumeSpecName: "kube-api-access-bhntj") pod "d6a3881a-9e0a-4549-bd78-1c8ff44882e1" (UID: "d6a3881a-9e0a-4549-bd78-1c8ff44882e1"). InnerVolumeSpecName "kube-api-access-bhntj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.159600 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6a3881a-9e0a-4549-bd78-1c8ff44882e1" (UID: "d6a3881a-9e0a-4549-bd78-1c8ff44882e1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.253911 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config\") pod \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.253947 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca\") pod \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254051 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5gbh\" (UniqueName: \"kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh\") pod \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254085 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert\") pod \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254142 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles\") pod \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\" (UID: \"3ead2cc5-e399-4a3f-be5e-99199fd476f3\") " Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254377 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254404 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254424 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fms6g\" (UniqueName: \"kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254452 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254511 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254543 4783 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254554 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhntj\" (UniqueName: \"kubernetes.io/projected/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-kube-api-access-bhntj\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.254563 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6a3881a-9e0a-4549-bd78-1c8ff44882e1-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.255117 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config" (OuterVolumeSpecName: "config") pod "3ead2cc5-e399-4a3f-be5e-99199fd476f3" (UID: "3ead2cc5-e399-4a3f-be5e-99199fd476f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.255216 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ead2cc5-e399-4a3f-be5e-99199fd476f3" (UID: "3ead2cc5-e399-4a3f-be5e-99199fd476f3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.255363 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3ead2cc5-e399-4a3f-be5e-99199fd476f3" (UID: "3ead2cc5-e399-4a3f-be5e-99199fd476f3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.257053 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ead2cc5-e399-4a3f-be5e-99199fd476f3" (UID: "3ead2cc5-e399-4a3f-be5e-99199fd476f3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.257196 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh" (OuterVolumeSpecName: "kube-api-access-p5gbh") pod "3ead2cc5-e399-4a3f-be5e-99199fd476f3" (UID: "3ead2cc5-e399-4a3f-be5e-99199fd476f3"). InnerVolumeSpecName "kube-api-access-p5gbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.355982 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356021 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356044 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fms6g\" (UniqueName: 
\"kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356078 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356154 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5gbh\" (UniqueName: \"kubernetes.io/projected/3ead2cc5-e399-4a3f-be5e-99199fd476f3-kube-api-access-p5gbh\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356182 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ead2cc5-e399-4a3f-be5e-99199fd476f3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356195 4783 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356205 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.356214 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ead2cc5-e399-4a3f-be5e-99199fd476f3-client-ca\") on 
node \"crc\" DevicePath \"\"" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.357255 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.357446 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.359668 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.370034 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fms6g\" (UniqueName: \"kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g\") pod \"route-controller-manager-79b859bb8c-bcn5j\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.422348 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.593179 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.710970 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" event={"ID":"cdf6a706-97c8-4041-b59d-d6ae9e899839","Type":"ContainerStarted","Data":"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.711037 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" event={"ID":"cdf6a706-97c8-4041-b59d-d6ae9e899839","Type":"ContainerStarted","Data":"8da94d70428a08b35ef5908107cf7e2a927f606ac1a6d8266bee9baf8119a38f"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.711246 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.713408 4783 generic.go:334] "Generic (PLEG): container finished" podID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" containerID="e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40" exitCode=0 Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.713546 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.714198 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" event={"ID":"d6a3881a-9e0a-4549-bd78-1c8ff44882e1","Type":"ContainerDied","Data":"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.714253 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s" event={"ID":"d6a3881a-9e0a-4549-bd78-1c8ff44882e1","Type":"ContainerDied","Data":"7d310f69d4d41dfb310a1db72bc1f6fd8f2fd1b570192d92846ab99029b772fe"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.714286 4783 scope.go:117] "RemoveContainer" containerID="e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.714286 4783 patch_prober.go:28] interesting pod/route-controller-manager-79b859bb8c-bcn5j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.714326 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.716476 4783 generic.go:334] "Generic (PLEG): container finished" podID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" 
containerID="61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24" exitCode=0 Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.716514 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" event={"ID":"3ead2cc5-e399-4a3f-be5e-99199fd476f3","Type":"ContainerDied","Data":"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.716543 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" event={"ID":"3ead2cc5-e399-4a3f-be5e-99199fd476f3","Type":"ContainerDied","Data":"bc2c78be0e97d63b6537a16d975d6e96117dd2858cfa4255ed23db8ceae68d17"} Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.716606 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.727554 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" podStartSLOduration=2.727539926 podStartE2EDuration="2.727539926s" podCreationTimestamp="2026-01-31 09:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:06.72609372 +0000 UTC m=+317.394777188" watchObservedRunningTime="2026-01-31 09:10:06.727539926 +0000 UTC m=+317.396223394" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.732935 4783 scope.go:117] "RemoveContainer" containerID="e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40" Jan 31 09:10:06 crc kubenswrapper[4783]: E0131 09:10:06.736430 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40\": container with ID starting with e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40 not found: ID does not exist" containerID="e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.736485 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40"} err="failed to get container status \"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40\": rpc error: code = NotFound desc = could not find container \"e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40\": container with ID starting with e164d303c79262b6d6573526cf84b4de1bfc0e27ab168ac1b8d0c076f6d25c40 not found: ID does not exist" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.736522 4783 scope.go:117] "RemoveContainer" containerID="61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.747069 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.749498 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c457667dd-zdc6s"] Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.752327 4783 scope.go:117] "RemoveContainer" containerID="61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24" Jan 31 09:10:06 crc kubenswrapper[4783]: E0131 09:10:06.752756 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24\": container with ID starting with 61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24 not found: 
ID does not exist" containerID="61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.752813 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24"} err="failed to get container status \"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24\": rpc error: code = NotFound desc = could not find container \"61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24\": container with ID starting with 61a2923922aad2d873d8748d972f8587092b2ec9175de98140e799b7c7848d24 not found: ID does not exist" Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.754281 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:06 crc kubenswrapper[4783]: I0131 09:10:06.756635 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5574b8fdb4-s5q4c"] Jan 31 09:10:07 crc kubenswrapper[4783]: I0131 09:10:07.653479 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" path="/var/lib/kubelet/pods/3ead2cc5-e399-4a3f-be5e-99199fd476f3/volumes" Jan 31 09:10:07 crc kubenswrapper[4783]: I0131 09:10:07.654478 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a3881a-9e0a-4549-bd78-1c8ff44882e1" path="/var/lib/kubelet/pods/d6a3881a-9e0a-4549-bd78-1c8ff44882e1/volumes" Jan 31 09:10:07 crc kubenswrapper[4783]: I0131 09:10:07.729211 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:07 crc kubenswrapper[4783]: I0131 09:10:07.814629 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 
09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.020269 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-594ddbd99c-869pc"] Jan 31 09:10:09 crc kubenswrapper[4783]: E0131 09:10:09.020809 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" containerName="controller-manager" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.020822 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" containerName="controller-manager" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.020921 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ead2cc5-e399-4a3f-be5e-99199fd476f3" containerName="controller-manager" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.021448 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.023519 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.024203 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.024810 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.024840 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.025836 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.025876 
4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.028965 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.032036 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-594ddbd99c-869pc"] Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.195135 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-proxy-ca-bundles\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.195324 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-client-ca\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.195436 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-config\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.195513 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdz7d\" (UniqueName: 
\"kubernetes.io/projected/74d99d06-e851-46de-81f4-5bf3c9fe984b-kube-api-access-wdz7d\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.195570 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74d99d06-e851-46de-81f4-5bf3c9fe984b-serving-cert\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.296459 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-config\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.296522 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdz7d\" (UniqueName: \"kubernetes.io/projected/74d99d06-e851-46de-81f4-5bf3c9fe984b-kube-api-access-wdz7d\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.296562 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74d99d06-e851-46de-81f4-5bf3c9fe984b-serving-cert\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 
09:10:09.296601 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-proxy-ca-bundles\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.296628 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-client-ca\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.297553 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-client-ca\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.298369 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-config\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.299325 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74d99d06-e851-46de-81f4-5bf3c9fe984b-proxy-ca-bundles\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " 
pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.304951 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74d99d06-e851-46de-81f4-5bf3c9fe984b-serving-cert\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.311406 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdz7d\" (UniqueName: \"kubernetes.io/projected/74d99d06-e851-46de-81f4-5bf3c9fe984b-kube-api-access-wdz7d\") pod \"controller-manager-594ddbd99c-869pc\" (UID: \"74d99d06-e851-46de-81f4-5bf3c9fe984b\") " pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.340327 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.508590 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-594ddbd99c-869pc"] Jan 31 09:10:09 crc kubenswrapper[4783]: W0131 09:10:09.513699 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d99d06_e851_46de_81f4_5bf3c9fe984b.slice/crio-4950a9409bc8dc95f9b5bf88cb03dd9e7c7d428b99b8d017fbfc623647fe695e WatchSource:0}: Error finding container 4950a9409bc8dc95f9b5bf88cb03dd9e7c7d428b99b8d017fbfc623647fe695e: Status 404 returned error can't find the container with id 4950a9409bc8dc95f9b5bf88cb03dd9e7c7d428b99b8d017fbfc623647fe695e Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.736306 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" event={"ID":"74d99d06-e851-46de-81f4-5bf3c9fe984b","Type":"ContainerStarted","Data":"48c330c51dc85871613e208dab8c19bd3859a19aba7c390f39d5c864ec4cf34d"} Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.736660 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" event={"ID":"74d99d06-e851-46de-81f4-5bf3c9fe984b","Type":"ContainerStarted","Data":"4950a9409bc8dc95f9b5bf88cb03dd9e7c7d428b99b8d017fbfc623647fe695e"} Jan 31 09:10:09 crc kubenswrapper[4783]: I0131 09:10:09.736397 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerName="route-controller-manager" containerID="cri-o://fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff" gracePeriod=30 Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.088981 4783 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.102882 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" podStartSLOduration=6.102871468 podStartE2EDuration="6.102871468s" podCreationTimestamp="2026-01-31 09:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:09.757389271 +0000 UTC m=+320.426072739" watchObservedRunningTime="2026-01-31 09:10:10.102871468 +0000 UTC m=+320.771554936" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.107182 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:10 crc kubenswrapper[4783]: E0131 09:10:10.107512 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerName="route-controller-manager" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.107531 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerName="route-controller-manager" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.107628 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerName="route-controller-manager" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.108080 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.121627 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.208958 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca\") pod \"cdf6a706-97c8-4041-b59d-d6ae9e899839\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.209027 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config\") pod \"cdf6a706-97c8-4041-b59d-d6ae9e899839\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.209058 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert\") pod \"cdf6a706-97c8-4041-b59d-d6ae9e899839\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.209083 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fms6g\" (UniqueName: \"kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g\") pod \"cdf6a706-97c8-4041-b59d-d6ae9e899839\" (UID: \"cdf6a706-97c8-4041-b59d-d6ae9e899839\") " Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.209845 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca" (OuterVolumeSpecName: "client-ca") pod "cdf6a706-97c8-4041-b59d-d6ae9e899839" 
(UID: "cdf6a706-97c8-4041-b59d-d6ae9e899839"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.210223 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.209877 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config" (OuterVolumeSpecName: "config") pod "cdf6a706-97c8-4041-b59d-d6ae9e899839" (UID: "cdf6a706-97c8-4041-b59d-d6ae9e899839"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.210355 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvls\" (UniqueName: \"kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.210561 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.210593 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.211005 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.211053 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdf6a706-97c8-4041-b59d-d6ae9e899839-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.215571 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g" (OuterVolumeSpecName: "kube-api-access-fms6g") pod "cdf6a706-97c8-4041-b59d-d6ae9e899839" (UID: "cdf6a706-97c8-4041-b59d-d6ae9e899839"). InnerVolumeSpecName "kube-api-access-fms6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.215639 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cdf6a706-97c8-4041-b59d-d6ae9e899839" (UID: "cdf6a706-97c8-4041-b59d-d6ae9e899839"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312673 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312737 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvls\" (UniqueName: \"kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312784 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312808 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312891 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fms6g\" (UniqueName: 
\"kubernetes.io/projected/cdf6a706-97c8-4041-b59d-d6ae9e899839-kube-api-access-fms6g\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.312907 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf6a706-97c8-4041-b59d-d6ae9e899839-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.314020 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.314407 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.316184 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.329241 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvls\" (UniqueName: \"kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls\") pod \"route-controller-manager-6c959d56df-zc7qk\" (UID: 
\"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.421396 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746466 4783 generic.go:334] "Generic (PLEG): container finished" podID="cdf6a706-97c8-4041-b59d-d6ae9e899839" containerID="fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff" exitCode=0 Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746584 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746652 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" event={"ID":"cdf6a706-97c8-4041-b59d-d6ae9e899839","Type":"ContainerDied","Data":"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff"} Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746704 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j" event={"ID":"cdf6a706-97c8-4041-b59d-d6ae9e899839","Type":"ContainerDied","Data":"8da94d70428a08b35ef5908107cf7e2a927f606ac1a6d8266bee9baf8119a38f"} Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746728 4783 scope.go:117] "RemoveContainer" containerID="fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.746981 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.751851 4783 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-594ddbd99c-869pc" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.762056 4783 scope.go:117] "RemoveContainer" containerID="fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff" Jan 31 09:10:10 crc kubenswrapper[4783]: E0131 09:10:10.763446 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff\": container with ID starting with fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff not found: ID does not exist" containerID="fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.763510 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff"} err="failed to get container status \"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff\": rpc error: code = NotFound desc = could not find container \"fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff\": container with ID starting with fd2f34e704087b0afccc7786137b4503497240e8992abef36a46df330f0b70ff not found: ID does not exist" Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.779358 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.781848 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-bcn5j"] Jan 31 09:10:10 crc kubenswrapper[4783]: I0131 09:10:10.785978 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:10 crc 
kubenswrapper[4783]: W0131 09:10:10.792573 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf3b5f8c_7789_4f82_a3cf_e15ed6f3e4f6.slice/crio-8545ce7787ac33808558fb94f0c26d51b5391d0843904b5baabff4d03e483bc4 WatchSource:0}: Error finding container 8545ce7787ac33808558fb94f0c26d51b5391d0843904b5baabff4d03e483bc4: Status 404 returned error can't find the container with id 8545ce7787ac33808558fb94f0c26d51b5391d0843904b5baabff4d03e483bc4 Jan 31 09:10:11 crc kubenswrapper[4783]: I0131 09:10:11.653178 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf6a706-97c8-4041-b59d-d6ae9e899839" path="/var/lib/kubelet/pods/cdf6a706-97c8-4041-b59d-d6ae9e899839/volumes" Jan 31 09:10:11 crc kubenswrapper[4783]: I0131 09:10:11.753711 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" event={"ID":"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6","Type":"ContainerStarted","Data":"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c"} Jan 31 09:10:11 crc kubenswrapper[4783]: I0131 09:10:11.753846 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" event={"ID":"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6","Type":"ContainerStarted","Data":"8545ce7787ac33808558fb94f0c26d51b5391d0843904b5baabff4d03e483bc4"} Jan 31 09:10:11 crc kubenswrapper[4783]: I0131 09:10:11.771817 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" podStartSLOduration=4.77180011 podStartE2EDuration="4.77180011s" podCreationTimestamp="2026-01-31 09:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:11.768963363 +0000 UTC m=+322.437646830" 
watchObservedRunningTime="2026-01-31 09:10:11.77180011 +0000 UTC m=+322.440483577" Jan 31 09:10:12 crc kubenswrapper[4783]: I0131 09:10:12.760032 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:12 crc kubenswrapper[4783]: I0131 09:10:12.765434 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:17 crc kubenswrapper[4783]: I0131 09:10:17.757094 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:10:17 crc kubenswrapper[4783]: I0131 09:10:17.758329 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:10:25 crc kubenswrapper[4783]: I0131 09:10:25.997467 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bx2c9"] Jan 31 09:10:25 crc kubenswrapper[4783]: I0131 09:10:25.998433 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.005594 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bx2c9"] Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097411 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-trusted-ca\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097474 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cef40e4-4261-481e-a9bd-071648a4f149-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097692 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-registry-certificates\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097751 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmb2t\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-kube-api-access-fmb2t\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097832 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-bound-sa-token\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.097958 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.098014 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cef40e4-4261-481e-a9bd-071648a4f149-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.098144 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-registry-tls\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.120218 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199304 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-registry-tls\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199377 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-trusted-ca\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199407 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cef40e4-4261-481e-a9bd-071648a4f149-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199438 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-registry-certificates\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199461 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmb2t\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-kube-api-access-fmb2t\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199485 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-bound-sa-token\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.199518 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cef40e4-4261-481e-a9bd-071648a4f149-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.200380 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cef40e4-4261-481e-a9bd-071648a4f149-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.201267 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-trusted-ca\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 
09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.201531 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cef40e4-4261-481e-a9bd-071648a4f149-registry-certificates\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.204827 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-registry-tls\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.204838 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cef40e4-4261-481e-a9bd-071648a4f149-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.213394 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmb2t\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-kube-api-access-fmb2t\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.214071 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cef40e4-4261-481e-a9bd-071648a4f149-bound-sa-token\") pod \"image-registry-66df7c8f76-bx2c9\" (UID: \"2cef40e4-4261-481e-a9bd-071648a4f149\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.322607 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.671439 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bx2c9"] Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.850564 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" event={"ID":"2cef40e4-4261-481e-a9bd-071648a4f149","Type":"ContainerStarted","Data":"6a76cd9ce70ec6b4ba850837bb1780f637084d379b85d839d04807528422a944"} Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.850622 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" event={"ID":"2cef40e4-4261-481e-a9bd-071648a4f149","Type":"ContainerStarted","Data":"d0407006fa38ec4f742bffe85aebf49f98afdd6cfff34b6318874e6939904a37"} Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.850726 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:26 crc kubenswrapper[4783]: I0131 09:10:26.868702 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" podStartSLOduration=1.8686762890000002 podStartE2EDuration="1.868676289s" podCreationTimestamp="2026-01-31 09:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:26.865062661 +0000 UTC m=+337.533746129" watchObservedRunningTime="2026-01-31 09:10:26.868676289 +0000 UTC m=+337.537359757" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.280308 4783 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.281033 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" podUID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" containerName="route-controller-manager" containerID="cri-o://ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c" gracePeriod=30 Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.670750 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.776829 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvls\" (UniqueName: \"kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls\") pod \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.776886 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config\") pod \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.776919 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert\") pod \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.776936 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca\") pod \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\" (UID: \"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6\") " Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.777615 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca" (OuterVolumeSpecName: "client-ca") pod "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" (UID: "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.777635 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config" (OuterVolumeSpecName: "config") pod "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" (UID: "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.782405 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" (UID: "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.783194 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls" (OuterVolumeSpecName: "kube-api-access-8zvls") pod "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" (UID: "af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6"). InnerVolumeSpecName "kube-api-access-8zvls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.878764 4783 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.878814 4783 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.878826 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvls\" (UniqueName: \"kubernetes.io/projected/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-kube-api-access-8zvls\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.878872 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.922690 4783 generic.go:334] "Generic (PLEG): container finished" podID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" containerID="ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c" exitCode=0 Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.922770 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.922806 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" event={"ID":"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6","Type":"ContainerDied","Data":"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c"} Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.923007 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk" event={"ID":"af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6","Type":"ContainerDied","Data":"8545ce7787ac33808558fb94f0c26d51b5391d0843904b5baabff4d03e483bc4"} Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.923106 4783 scope.go:117] "RemoveContainer" containerID="ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.939491 4783 scope.go:117] "RemoveContainer" containerID="ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c" Jan 31 09:10:39 crc kubenswrapper[4783]: E0131 09:10:39.939835 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c\": container with ID starting with ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c not found: ID does not exist" containerID="ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.939926 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c"} err="failed to get container status \"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c\": rpc error: code = NotFound desc 
= could not find container \"ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c\": container with ID starting with ebb97bc8915412b0205ef0233ba2331730d45faac4201ba6cfc5d8f881632c7c not found: ID does not exist" Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.949083 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:39 crc kubenswrapper[4783]: I0131 09:10:39.951830 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c959d56df-zc7qk"] Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.040923 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2"] Jan 31 09:10:41 crc kubenswrapper[4783]: E0131 09:10:41.042407 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" containerName="route-controller-manager" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.042672 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" containerName="route-controller-manager" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.042823 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" containerName="route-controller-manager" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.043404 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.046851 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.046926 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.047098 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.047318 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.049599 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.049893 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.061395 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2"] Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.194773 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-client-ca\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.194848 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m9c8\" (UniqueName: \"kubernetes.io/projected/513c8381-c21d-4b57-9b0f-8bff36c4dab1-kube-api-access-9m9c8\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.194904 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-config\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.194994 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/513c8381-c21d-4b57-9b0f-8bff36c4dab1-serving-cert\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.295559 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-config\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.295617 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/513c8381-c21d-4b57-9b0f-8bff36c4dab1-serving-cert\") pod \"route-controller-manager-79b859bb8c-sb5l2\" 
(UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.295649 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-client-ca\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.295698 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m9c8\" (UniqueName: \"kubernetes.io/projected/513c8381-c21d-4b57-9b0f-8bff36c4dab1-kube-api-access-9m9c8\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.297585 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-client-ca\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.297690 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513c8381-c21d-4b57-9b0f-8bff36c4dab1-config\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.303015 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/513c8381-c21d-4b57-9b0f-8bff36c4dab1-serving-cert\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.315942 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m9c8\" (UniqueName: \"kubernetes.io/projected/513c8381-c21d-4b57-9b0f-8bff36c4dab1-kube-api-access-9m9c8\") pod \"route-controller-manager-79b859bb8c-sb5l2\" (UID: \"513c8381-c21d-4b57-9b0f-8bff36c4dab1\") " pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.355979 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.652493 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6" path="/var/lib/kubelet/pods/af3b5f8c-7789-4f82-a3cf-e15ed6f3e4f6/volumes" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.718256 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2"] Jan 31 09:10:41 crc kubenswrapper[4783]: W0131 09:10:41.723762 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513c8381_c21d_4b57_9b0f_8bff36c4dab1.slice/crio-b72684b0172646899aba0be563220a5216b4ca3ad1fc6f10d229ab9d50762b6c WatchSource:0}: Error finding container b72684b0172646899aba0be563220a5216b4ca3ad1fc6f10d229ab9d50762b6c: Status 404 returned error can't find the container with id b72684b0172646899aba0be563220a5216b4ca3ad1fc6f10d229ab9d50762b6c Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 
09:10:41.937955 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" event={"ID":"513c8381-c21d-4b57-9b0f-8bff36c4dab1","Type":"ContainerStarted","Data":"14c806ca2838a0f077b558be869650faf3ca19d6f79f8df1bda4b7b61f028acc"} Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.938016 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" event={"ID":"513c8381-c21d-4b57-9b0f-8bff36c4dab1","Type":"ContainerStarted","Data":"b72684b0172646899aba0be563220a5216b4ca3ad1fc6f10d229ab9d50762b6c"} Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.938335 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:41 crc kubenswrapper[4783]: I0131 09:10:41.954960 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" podStartSLOduration=2.954944951 podStartE2EDuration="2.954944951s" podCreationTimestamp="2026-01-31 09:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:41.950316741 +0000 UTC m=+352.619000219" watchObservedRunningTime="2026-01-31 09:10:41.954944951 +0000 UTC m=+352.623628419" Jan 31 09:10:42 crc kubenswrapper[4783]: I0131 09:10:42.169201 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79b859bb8c-sb5l2" Jan 31 09:10:46 crc kubenswrapper[4783]: I0131 09:10:46.327002 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bx2c9" Jan 31 09:10:46 crc kubenswrapper[4783]: I0131 09:10:46.367835 4783 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:10:47 crc kubenswrapper[4783]: I0131 09:10:47.756964 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:10:47 crc kubenswrapper[4783]: I0131 09:10:47.757409 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.395668 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" podUID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" containerName="registry" containerID="cri-o://290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6" gracePeriod=30 Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.729486 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868179 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868259 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868336 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868406 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj98b\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868451 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.868477 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.869087 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.869119 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token\") pod \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\" (UID: \"27f9649b-142f-44ae-9e1b-8a6d8026ddcc\") " Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.869752 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.869836 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.870459 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.870692 4783 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.876227 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b" (OuterVolumeSpecName: "kube-api-access-gj98b") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "kube-api-access-gj98b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.876605 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.876641 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.876809 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.880221 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.884653 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "27f9649b-142f-44ae-9e1b-8a6d8026ddcc" (UID: "27f9649b-142f-44ae-9e1b-8a6d8026ddcc"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.972197 4783 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.972225 4783 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.972237 4783 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.972247 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj98b\" (UniqueName: \"kubernetes.io/projected/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-kube-api-access-gj98b\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:11 crc kubenswrapper[4783]: I0131 09:11:11.972261 4783 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27f9649b-142f-44ae-9e1b-8a6d8026ddcc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.099417 4783 generic.go:334] "Generic (PLEG): container finished" podID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" containerID="290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6" exitCode=0 Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.099495 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.099491 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" event={"ID":"27f9649b-142f-44ae-9e1b-8a6d8026ddcc","Type":"ContainerDied","Data":"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6"} Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.099605 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mx2nq" event={"ID":"27f9649b-142f-44ae-9e1b-8a6d8026ddcc","Type":"ContainerDied","Data":"8ba864220b34e169768e77b918d44399857128b829714cd4b5e929c16d186eaf"} Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.099639 4783 scope.go:117] "RemoveContainer" containerID="290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6" Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.114854 4783 scope.go:117] "RemoveContainer" containerID="290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6" Jan 31 09:11:12 crc kubenswrapper[4783]: E0131 09:11:12.115224 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6\": container with ID starting with 290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6 not found: ID does not exist" containerID="290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6" Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.115270 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6"} err="failed to get container status \"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6\": rpc error: code = NotFound desc = could not find container 
\"290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6\": container with ID starting with 290501d0b45d53b04a9f99b1f6a654539b3e61853feca6c94a306f56889e66b6 not found: ID does not exist" Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.127131 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:11:12 crc kubenswrapper[4783]: I0131 09:11:12.129711 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mx2nq"] Jan 31 09:11:13 crc kubenswrapper[4783]: I0131 09:11:13.651798 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" path="/var/lib/kubelet/pods/27f9649b-142f-44ae-9e1b-8a6d8026ddcc/volumes" Jan 31 09:11:17 crc kubenswrapper[4783]: I0131 09:11:17.757197 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:11:17 crc kubenswrapper[4783]: I0131 09:11:17.757622 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:11:17 crc kubenswrapper[4783]: I0131 09:11:17.757690 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:11:17 crc kubenswrapper[4783]: I0131 09:11:17.758210 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:11:17 crc kubenswrapper[4783]: I0131 09:11:17.758280 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a" gracePeriod=600 Jan 31 09:11:18 crc kubenswrapper[4783]: I0131 09:11:18.137861 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a" exitCode=0 Jan 31 09:11:18 crc kubenswrapper[4783]: I0131 09:11:18.137936 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a"} Jan 31 09:11:18 crc kubenswrapper[4783]: I0131 09:11:18.138289 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c"} Jan 31 09:11:18 crc kubenswrapper[4783]: I0131 09:11:18.138337 4783 scope.go:117] "RemoveContainer" containerID="e133a25f79aab86853cd12b0b22a815545834d3814cc646f8cf244d8ec1fbe5c" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.721673 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz"] Jan 31 09:12:36 crc kubenswrapper[4783]: E0131 
09:12:36.722415 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" containerName="registry" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.722430 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" containerName="registry" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.722521 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f9649b-142f-44ae-9e1b-8a6d8026ddcc" containerName="registry" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.722882 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.724553 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.724914 4783 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zxbdl" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.725113 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.729428 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-bvf4q"] Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.730042 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-bvf4q" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.732430 4783 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-95m94" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.734718 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz"] Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.741134 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bhlz2"] Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.741847 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.743101 4783 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pwljh" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.748853 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bhlz2"] Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.762616 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-bvf4q"] Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.789429 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjbz9\" (UniqueName: \"kubernetes.io/projected/5ac717c7-d413-4765-9bf5-b0d7ad8163c6-kube-api-access-sjbz9\") pod \"cert-manager-858654f9db-bvf4q\" (UID: \"5ac717c7-d413-4765-9bf5-b0d7ad8163c6\") " pod="cert-manager/cert-manager-858654f9db-bvf4q" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.789466 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxv2\" (UniqueName: 
\"kubernetes.io/projected/c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd-kube-api-access-7cxv2\") pod \"cert-manager-cainjector-cf98fcc89-mhfqz\" (UID: \"c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.789551 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2m5m\" (UniqueName: \"kubernetes.io/projected/db7bedab-fb68-4a6a-887c-d2aa1a63d0ee-kube-api-access-q2m5m\") pod \"cert-manager-webhook-687f57d79b-bhlz2\" (UID: \"db7bedab-fb68-4a6a-887c-d2aa1a63d0ee\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.890811 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2m5m\" (UniqueName: \"kubernetes.io/projected/db7bedab-fb68-4a6a-887c-d2aa1a63d0ee-kube-api-access-q2m5m\") pod \"cert-manager-webhook-687f57d79b-bhlz2\" (UID: \"db7bedab-fb68-4a6a-887c-d2aa1a63d0ee\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.890927 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjbz9\" (UniqueName: \"kubernetes.io/projected/5ac717c7-d413-4765-9bf5-b0d7ad8163c6-kube-api-access-sjbz9\") pod \"cert-manager-858654f9db-bvf4q\" (UID: \"5ac717c7-d413-4765-9bf5-b0d7ad8163c6\") " pod="cert-manager/cert-manager-858654f9db-bvf4q" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.890955 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxv2\" (UniqueName: \"kubernetes.io/projected/c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd-kube-api-access-7cxv2\") pod \"cert-manager-cainjector-cf98fcc89-mhfqz\" (UID: \"c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 
09:12:36.909418 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2m5m\" (UniqueName: \"kubernetes.io/projected/db7bedab-fb68-4a6a-887c-d2aa1a63d0ee-kube-api-access-q2m5m\") pod \"cert-manager-webhook-687f57d79b-bhlz2\" (UID: \"db7bedab-fb68-4a6a-887c-d2aa1a63d0ee\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.909483 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxv2\" (UniqueName: \"kubernetes.io/projected/c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd-kube-api-access-7cxv2\") pod \"cert-manager-cainjector-cf98fcc89-mhfqz\" (UID: \"c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" Jan 31 09:12:36 crc kubenswrapper[4783]: I0131 09:12:36.909975 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjbz9\" (UniqueName: \"kubernetes.io/projected/5ac717c7-d413-4765-9bf5-b0d7ad8163c6-kube-api-access-sjbz9\") pod \"cert-manager-858654f9db-bvf4q\" (UID: \"5ac717c7-d413-4765-9bf5-b0d7ad8163c6\") " pod="cert-manager/cert-manager-858654f9db-bvf4q" Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.044644 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.052356 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-bvf4q" Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.057327 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.434469 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-bvf4q"] Jan 31 09:12:37 crc kubenswrapper[4783]: W0131 09:12:37.444652 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac717c7_d413_4765_9bf5_b0d7ad8163c6.slice/crio-23b7f359fd1b011a26c50a4387c640e650dde0cff407a971a278e30a030a07fc WatchSource:0}: Error finding container 23b7f359fd1b011a26c50a4387c640e650dde0cff407a971a278e30a030a07fc: Status 404 returned error can't find the container with id 23b7f359fd1b011a26c50a4387c640e650dde0cff407a971a278e30a030a07fc Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.448626 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.469338 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bhlz2"] Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.472341 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz"] Jan 31 09:12:37 crc kubenswrapper[4783]: W0131 09:12:37.472554 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb7bedab_fb68_4a6a_887c_d2aa1a63d0ee.slice/crio-9368f7e194c9a6f8d066d01c976319091e27966bcbbce41801a4d3820c18bcc6 WatchSource:0}: Error finding container 9368f7e194c9a6f8d066d01c976319091e27966bcbbce41801a4d3820c18bcc6: Status 404 returned error can't find the container with id 9368f7e194c9a6f8d066d01c976319091e27966bcbbce41801a4d3820c18bcc6 Jan 31 09:12:37 crc kubenswrapper[4783]: W0131 09:12:37.473837 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0721d7b_1d48_4f0f_bdba_4e2afa8cf7dd.slice/crio-528d1dd0bbac310bc8ad6328d2667b616647afb16ed61c19c87f2e14b884a10a WatchSource:0}: Error finding container 528d1dd0bbac310bc8ad6328d2667b616647afb16ed61c19c87f2e14b884a10a: Status 404 returned error can't find the container with id 528d1dd0bbac310bc8ad6328d2667b616647afb16ed61c19c87f2e14b884a10a Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.561853 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" event={"ID":"c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd","Type":"ContainerStarted","Data":"528d1dd0bbac310bc8ad6328d2667b616647afb16ed61c19c87f2e14b884a10a"} Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.563153 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" event={"ID":"db7bedab-fb68-4a6a-887c-d2aa1a63d0ee","Type":"ContainerStarted","Data":"9368f7e194c9a6f8d066d01c976319091e27966bcbbce41801a4d3820c18bcc6"} Jan 31 09:12:37 crc kubenswrapper[4783]: I0131 09:12:37.564359 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-bvf4q" event={"ID":"5ac717c7-d413-4765-9bf5-b0d7ad8163c6","Type":"ContainerStarted","Data":"23b7f359fd1b011a26c50a4387c640e650dde0cff407a971a278e30a030a07fc"} Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.584628 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" event={"ID":"c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd","Type":"ContainerStarted","Data":"78fb44a6d068985081a53b9b67eae6657174f46dd9629ec6b2efda38f781edf7"} Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.586724 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" 
event={"ID":"db7bedab-fb68-4a6a-887c-d2aa1a63d0ee","Type":"ContainerStarted","Data":"14309427a2cf302a9dc913f29cced71a389819825488e9050ffd18a219a92625"} Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.586832 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.587872 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-bvf4q" event={"ID":"5ac717c7-d413-4765-9bf5-b0d7ad8163c6","Type":"ContainerStarted","Data":"e9e95dd997f2be85cf5a2b9d3748328eb2751ffd833b1f955738d9b27a23bb12"} Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.598137 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mhfqz" podStartSLOduration=3.031281373 podStartE2EDuration="5.598127188s" podCreationTimestamp="2026-01-31 09:12:36 +0000 UTC" firstStartedPulling="2026-01-31 09:12:37.475898452 +0000 UTC m=+468.144581920" lastFinishedPulling="2026-01-31 09:12:40.042744267 +0000 UTC m=+470.711427735" observedRunningTime="2026-01-31 09:12:41.594642363 +0000 UTC m=+472.263325831" watchObservedRunningTime="2026-01-31 09:12:41.598127188 +0000 UTC m=+472.266810656" Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.616384 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" podStartSLOduration=2.491180008 podStartE2EDuration="5.616367479s" podCreationTimestamp="2026-01-31 09:12:36 +0000 UTC" firstStartedPulling="2026-01-31 09:12:37.474800443 +0000 UTC m=+468.143483911" lastFinishedPulling="2026-01-31 09:12:40.599987913 +0000 UTC m=+471.268671382" observedRunningTime="2026-01-31 09:12:41.612541121 +0000 UTC m=+472.281224579" watchObservedRunningTime="2026-01-31 09:12:41.616367479 +0000 UTC m=+472.285050948" Jan 31 09:12:41 crc kubenswrapper[4783]: I0131 09:12:41.624916 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-bvf4q" podStartSLOduration=2.434576417 podStartE2EDuration="5.624899632s" podCreationTimestamp="2026-01-31 09:12:36 +0000 UTC" firstStartedPulling="2026-01-31 09:12:37.448396683 +0000 UTC m=+468.117080151" lastFinishedPulling="2026-01-31 09:12:40.638719898 +0000 UTC m=+471.307403366" observedRunningTime="2026-01-31 09:12:41.621703661 +0000 UTC m=+472.290387130" watchObservedRunningTime="2026-01-31 09:12:41.624899632 +0000 UTC m=+472.293583101" Jan 31 09:12:47 crc kubenswrapper[4783]: I0131 09:12:47.061472 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-bhlz2" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370126 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vr882"] Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370488 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-controller" containerID="cri-o://860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370863 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="nbdb" containerID="cri-o://27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370861 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370960 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-node" containerID="cri-o://7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.371047 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="sbdb" containerID="cri-o://3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370984 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="northd" containerID="cri-o://193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.370847 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-acl-logging" containerID="cri-o://80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.395467 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" containerID="cri-o://657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" gracePeriod=30 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.623527 4783 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/3.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.625372 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/1.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.625801 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/0.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.625839 4783 generic.go:334] "Generic (PLEG): container finished" podID="0b5ffe9c-191a-4902-8e13-6a869f158784" containerID="6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7" exitCode=2 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.625940 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerDied","Data":"6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.625996 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovn-acl-logging/0.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.626024 4783 scope.go:117] "RemoveContainer" containerID="11edd04f505d599460c394ceb21478807ce779c7aec546bca8f27fb9bc11679c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.626514 4783 scope.go:117] "RemoveContainer" containerID="6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.626650 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovn-controller/0.log" Jan 31 09:12:48 crc 
kubenswrapper[4783]: I0131 09:12:48.627208 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.629118 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovnkube-controller/3.log" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.629244 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8q8td_openshift-multus(0b5ffe9c-191a-4902-8e13-6a869f158784)\"" pod="openshift-multus/multus-8q8td" podUID="0b5ffe9c-191a-4902-8e13-6a869f158784" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.633996 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovn-acl-logging/0.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.634783 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vr882_4b3d03a1-7611-470d-a402-4f40ce95a54f/ovn-controller/0.log" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635189 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635252 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635262 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" 
containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635272 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635280 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635288 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" exitCode=0 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635296 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" exitCode=143 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635304 4783 generic.go:334] "Generic (PLEG): container finished" podID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" exitCode=143 Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635356 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635395 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" 
event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635406 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635417 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635427 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635439 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635453 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635465 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635473 4783 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635479 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635484 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635490 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635495 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635500 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635506 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635512 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635518 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635526 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635533 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635540 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635548 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635554 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635560 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635568 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 
09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635574 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635581 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635588 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635597 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635607 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635614 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635619 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635624 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635629 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635635 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635644 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635649 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635654 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635659 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635666 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" event={"ID":"4b3d03a1-7611-470d-a402-4f40ce95a54f","Type":"ContainerDied","Data":"cc659e66822c3dc572ee10a57ba9f66056fddad388d51623c912080de0da03aa"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635675 4783 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635681 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635686 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635692 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635697 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635702 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635707 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635711 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635716 4783 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.635721 4783 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.656083 4783 scope.go:117] "RemoveContainer" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.675278 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.700895 4783 scope.go:117] "RemoveContainer" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701388 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wtx6h"] Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701725 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="sbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701756 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="sbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701768 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701777 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701785 4783 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701794 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701804 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701810 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701817 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701823 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701839 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="nbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701848 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="nbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701860 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="northd" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701868 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="northd" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701876 4783 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.701882 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.701892 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702018 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.702031 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kubecfg-setup" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702040 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kubecfg-setup" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.702049 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-acl-logging" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702055 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-acl-logging" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.702066 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-node" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702073 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-node" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702814 4783 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702841 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702852 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702859 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="northd" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702872 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovn-acl-logging" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702892 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="sbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702903 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702910 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702918 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="nbdb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.702927 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="kube-rbac-proxy-node" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.703061 
4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.703072 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.703239 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.703254 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" containerName="ovnkube-controller" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.705424 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.718005 4783 scope.go:117] "RemoveContainer" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.729745 4783 scope.go:117] "RemoveContainer" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.741885 4783 scope.go:117] "RemoveContainer" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.755357 4783 scope.go:117] "RemoveContainer" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.768661 4783 scope.go:117] "RemoveContainer" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.784694 4783 scope.go:117] "RemoveContainer" 
containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.803814 4783 scope.go:117] "RemoveContainer" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.813920 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.813961 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814008 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814035 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814076 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: 
\"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814087 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814097 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814120 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814141 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814214 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814270 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814300 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814311 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814364 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814448 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814516 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814553 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814589 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814625 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814670 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814725 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814748 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814782 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqc7s\" (UniqueName: \"kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s\") pod \"4b3d03a1-7611-470d-a402-4f40ce95a54f\" (UID: \"4b3d03a1-7611-470d-a402-4f40ce95a54f\") " Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814625 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814811 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814646 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814663 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814678 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814698 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814841 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814869 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814883 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814916 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log" (OuterVolumeSpecName: "node-log") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814916 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket" (OuterVolumeSpecName: "log-socket") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.814951 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash" (OuterVolumeSpecName: "host-slash") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815081 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815095 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815121 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-bin\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815253 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-script-lib\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815288 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/062ed47e-3da3-4f77-bedb-9443d7babc18-ovn-node-metrics-cert\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815333 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-slash\") pod \"ovnkube-node-wtx6h\" (UID: 
\"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815354 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815385 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-kubelet\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815429 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-netns\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815444 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-systemd-units\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815464 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-ovn\") pod 
\"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815508 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-log-socket\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815557 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-systemd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815581 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-node-log\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815598 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-env-overrides\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815620 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815646 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-var-lib-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815706 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-etc-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815728 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-netd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.815755 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816090 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpkf\" (UniqueName: \"kubernetes.io/projected/062ed47e-3da3-4f77-bedb-9443d7babc18-kube-api-access-bmpkf\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816151 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-config\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816273 4783 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-slash\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816295 4783 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816309 4783 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816329 4783 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 
09:12:48.816339 4783 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816347 4783 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816358 4783 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816366 4783 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816377 4783 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816386 4783 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816396 4783 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816410 4783 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816419 4783 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816427 4783 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816436 4783 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-log-socket\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816444 4783 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.816454 4783 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-node-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.819377 4783 scope.go:117] "RemoveContainer" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.819883 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": container with ID starting with 
657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f not found: ID does not exist" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.819973 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} err="failed to get container status \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": rpc error: code = NotFound desc = could not find container \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": container with ID starting with 657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.820074 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.820456 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": container with ID starting with 7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3 not found: ID does not exist" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.820550 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} err="failed to get container status \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": rpc error: code = NotFound desc = could not find container \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": container with ID starting with 7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3 not found: ID does not 
exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.820622 4783 scope.go:117] "RemoveContainer" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.820819 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.820981 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": container with ID starting with 3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb not found: ID does not exist" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821022 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} err="failed to get container status \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": rpc error: code = NotFound desc = could not find container \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": container with ID starting with 3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821069 4783 scope.go:117] "RemoveContainer" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821178 4783 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s" (OuterVolumeSpecName: "kube-api-access-gqc7s") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "kube-api-access-gqc7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.821422 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": container with ID starting with 27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c not found: ID does not exist" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821447 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} err="failed to get container status \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": rpc error: code = NotFound desc = could not find container \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": container with ID starting with 27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821465 4783 scope.go:117] "RemoveContainer" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.821847 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": container with ID starting with 193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc not found: ID does not exist" 
containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821889 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} err="failed to get container status \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": rpc error: code = NotFound desc = could not find container \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": container with ID starting with 193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.821918 4783 scope.go:117] "RemoveContainer" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.822292 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": container with ID starting with 39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe not found: ID does not exist" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.822354 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} err="failed to get container status \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": rpc error: code = NotFound desc = could not find container \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": container with ID starting with 39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.822394 4783 scope.go:117] 
"RemoveContainer" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.823105 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": container with ID starting with 7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899 not found: ID does not exist" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823142 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} err="failed to get container status \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": rpc error: code = NotFound desc = could not find container \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": container with ID starting with 7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823176 4783 scope.go:117] "RemoveContainer" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.823468 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": container with ID starting with 80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475 not found: ID does not exist" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823498 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} err="failed to get container status \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": rpc error: code = NotFound desc = could not find container \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": container with ID starting with 80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823516 4783 scope.go:117] "RemoveContainer" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.823891 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": container with ID starting with 860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848 not found: ID does not exist" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823920 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} err="failed to get container status \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": rpc error: code = NotFound desc = could not find container \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": container with ID starting with 860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.823936 4783 scope.go:117] "RemoveContainer" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: E0131 09:12:48.824248 4783 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": container with ID starting with a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29 not found: ID does not exist" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824267 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} err="failed to get container status \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": rpc error: code = NotFound desc = could not find container \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": container with ID starting with a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824281 4783 scope.go:117] "RemoveContainer" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824570 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} err="failed to get container status \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": rpc error: code = NotFound desc = could not find container \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": container with ID starting with 657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824601 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824909 4783 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} err="failed to get container status \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": rpc error: code = NotFound desc = could not find container \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": container with ID starting with 7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.824939 4783 scope.go:117] "RemoveContainer" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825235 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} err="failed to get container status \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": rpc error: code = NotFound desc = could not find container \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": container with ID starting with 3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825261 4783 scope.go:117] "RemoveContainer" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825534 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} err="failed to get container status \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": rpc error: code = NotFound desc = could not find container \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": container with ID starting with 
27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825558 4783 scope.go:117] "RemoveContainer" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825787 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} err="failed to get container status \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": rpc error: code = NotFound desc = could not find container \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": container with ID starting with 193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.825811 4783 scope.go:117] "RemoveContainer" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826066 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} err="failed to get container status \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": rpc error: code = NotFound desc = could not find container \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": container with ID starting with 39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826094 4783 scope.go:117] "RemoveContainer" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826385 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} err="failed to get container status \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": rpc error: code = NotFound desc = could not find container \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": container with ID starting with 7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826413 4783 scope.go:117] "RemoveContainer" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826651 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} err="failed to get container status \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": rpc error: code = NotFound desc = could not find container \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": container with ID starting with 80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826677 4783 scope.go:117] "RemoveContainer" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826946 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} err="failed to get container status \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": rpc error: code = NotFound desc = could not find container \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": container with ID starting with 860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848 not found: ID does not 
exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.826969 4783 scope.go:117] "RemoveContainer" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827294 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} err="failed to get container status \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": rpc error: code = NotFound desc = could not find container \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": container with ID starting with a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827326 4783 scope.go:117] "RemoveContainer" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827586 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} err="failed to get container status \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": rpc error: code = NotFound desc = could not find container \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": container with ID starting with 657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827611 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827851 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} err="failed to get container status 
\"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": rpc error: code = NotFound desc = could not find container \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": container with ID starting with 7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.827873 4783 scope.go:117] "RemoveContainer" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828118 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} err="failed to get container status \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": rpc error: code = NotFound desc = could not find container \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": container with ID starting with 3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828141 4783 scope.go:117] "RemoveContainer" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828208 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4b3d03a1-7611-470d-a402-4f40ce95a54f" (UID: "4b3d03a1-7611-470d-a402-4f40ce95a54f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828384 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} err="failed to get container status \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": rpc error: code = NotFound desc = could not find container \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": container with ID starting with 27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828408 4783 scope.go:117] "RemoveContainer" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828623 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} err="failed to get container status \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": rpc error: code = NotFound desc = could not find container \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": container with ID starting with 193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828641 4783 scope.go:117] "RemoveContainer" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828886 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} err="failed to get container status \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": rpc error: code = NotFound desc = could not find container 
\"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": container with ID starting with 39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.828909 4783 scope.go:117] "RemoveContainer" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829145 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} err="failed to get container status \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": rpc error: code = NotFound desc = could not find container \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": container with ID starting with 7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829191 4783 scope.go:117] "RemoveContainer" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829437 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} err="failed to get container status \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": rpc error: code = NotFound desc = could not find container \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": container with ID starting with 80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829462 4783 scope.go:117] "RemoveContainer" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829699 4783 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} err="failed to get container status \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": rpc error: code = NotFound desc = could not find container \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": container with ID starting with 860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829721 4783 scope.go:117] "RemoveContainer" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829942 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} err="failed to get container status \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": rpc error: code = NotFound desc = could not find container \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": container with ID starting with a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.829964 4783 scope.go:117] "RemoveContainer" containerID="657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830226 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f"} err="failed to get container status \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": rpc error: code = NotFound desc = could not find container \"657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f\": container with ID starting with 
657ee6ca1a5a11032f6ec4d50f55b585f799094c5d69e165f4da2170f3e6d25f not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830250 4783 scope.go:117] "RemoveContainer" containerID="7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830478 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3"} err="failed to get container status \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": rpc error: code = NotFound desc = could not find container \"7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3\": container with ID starting with 7ae9c66155f5155af06802e2ab64ced58e4777c77c5c37a65d2189ce87c4e0f3 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830499 4783 scope.go:117] "RemoveContainer" containerID="3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830716 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb"} err="failed to get container status \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": rpc error: code = NotFound desc = could not find container \"3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb\": container with ID starting with 3f75cec632072ead35e8b933bc771ef6dddd1f7899265809d56e0ce9cb772abb not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830738 4783 scope.go:117] "RemoveContainer" containerID="27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830958 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c"} err="failed to get container status \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": rpc error: code = NotFound desc = could not find container \"27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c\": container with ID starting with 27d1bce8dd1252ffed289cf70e9709e881d15d8cc93b6b04dafac369f661a60c not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.830977 4783 scope.go:117] "RemoveContainer" containerID="193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831218 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc"} err="failed to get container status \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": rpc error: code = NotFound desc = could not find container \"193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc\": container with ID starting with 193dc82e72f029bd984a09e95955eccd31db3ff9766b619b47c0a2c0c9fc3fcc not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831238 4783 scope.go:117] "RemoveContainer" containerID="39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831492 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe"} err="failed to get container status \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": rpc error: code = NotFound desc = could not find container \"39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe\": container with ID starting with 39b0f6daa67c1df6c158fa17be2e9d37f7140786519abcc43ec3681418883bfe not found: ID does not 
exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831515 4783 scope.go:117] "RemoveContainer" containerID="7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831748 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899"} err="failed to get container status \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": rpc error: code = NotFound desc = could not find container \"7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899\": container with ID starting with 7aeaa9cba55dc6d76c5f3634729178f53d0f384602acf413028e6f51d353b899 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.831765 4783 scope.go:117] "RemoveContainer" containerID="80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.832005 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475"} err="failed to get container status \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": rpc error: code = NotFound desc = could not find container \"80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475\": container with ID starting with 80d8eaf1c75e4565ce4b845356c56b690716a80c181522ed24aba6681561b475 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.832025 4783 scope.go:117] "RemoveContainer" containerID="860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.832288 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848"} err="failed to get container status 
\"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": rpc error: code = NotFound desc = could not find container \"860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848\": container with ID starting with 860f9f9692cdbf7761a8b1fbb2b53ad81a0b98ffbabfa885614078b3093e8848 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.832310 4783 scope.go:117] "RemoveContainer" containerID="a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.832567 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29"} err="failed to get container status \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": rpc error: code = NotFound desc = could not find container \"a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29\": container with ID starting with a47112de2b22e721a35344b5a062170397bca0e7904b3309f7da3fc9cf120a29 not found: ID does not exist" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.916983 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-systemd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917040 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-node-log\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917064 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-env-overrides\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917105 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-systemd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917109 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917140 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917184 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-node-log\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917245 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-var-lib-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917339 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-etc-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917366 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-netd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917454 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpkf\" (UniqueName: \"kubernetes.io/projected/062ed47e-3da3-4f77-bedb-9443d7babc18-kube-api-access-bmpkf\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917474 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-config\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917509 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917526 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-bin\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917593 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-script-lib\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917625 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/062ed47e-3da3-4f77-bedb-9443d7babc18-ovn-node-metrics-cert\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917661 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-slash\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917681 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917719 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-kubelet\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917757 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-netns\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917777 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-systemd-units\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917794 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-ovn\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917839 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-log-socket\") pod 
\"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917895 4783 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d03a1-7611-470d-a402-4f40ce95a54f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917930 4783 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d03a1-7611-470d-a402-4f40ce95a54f-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917943 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqc7s\" (UniqueName: \"kubernetes.io/projected/4b3d03a1-7611-470d-a402-4f40ce95a54f-kube-api-access-gqc7s\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917977 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-log-socket\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918005 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-slash\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918027 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-netns\") pod \"ovnkube-node-wtx6h\" (UID: 
\"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918049 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-systemd-units\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918059 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-kubelet\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918070 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-run-ovn-kubernetes\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918072 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-ovn\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.917684 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-env-overrides\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc 
kubenswrapper[4783]: I0131 09:12:48.918103 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-run-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918106 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-bin\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918136 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-config\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918200 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-var-lib-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918217 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-host-cni-netd\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918242 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/062ed47e-3da3-4f77-bedb-9443d7babc18-etc-openvswitch\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.918689 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/062ed47e-3da3-4f77-bedb-9443d7babc18-ovnkube-script-lib\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.920083 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/062ed47e-3da3-4f77-bedb-9443d7babc18-ovn-node-metrics-cert\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:48 crc kubenswrapper[4783]: I0131 09:12:48.931744 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpkf\" (UniqueName: \"kubernetes.io/projected/062ed47e-3da3-4f77-bedb-9443d7babc18-kube-api-access-bmpkf\") pod \"ovnkube-node-wtx6h\" (UID: \"062ed47e-3da3-4f77-bedb-9443d7babc18\") " pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.017458 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:49 crc kubenswrapper[4783]: W0131 09:12:49.034760 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062ed47e_3da3_4f77_bedb_9443d7babc18.slice/crio-183cebc4cd43188dd9045a954a099b370e9df060e2568fc8bebf61d8d82734c1 WatchSource:0}: Error finding container 183cebc4cd43188dd9045a954a099b370e9df060e2568fc8bebf61d8d82734c1: Status 404 returned error can't find the container with id 183cebc4cd43188dd9045a954a099b370e9df060e2568fc8bebf61d8d82734c1 Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.642602 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vr882" Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.649647 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/1.log" Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.653008 4783 generic.go:334] "Generic (PLEG): container finished" podID="062ed47e-3da3-4f77-bedb-9443d7babc18" containerID="e86d0d28c39b591456211a17f33b5622899061b5fbd88d8ef4c84718b87baae7" exitCode=0 Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.663389 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerDied","Data":"e86d0d28c39b591456211a17f33b5622899061b5fbd88d8ef4c84718b87baae7"} Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.663418 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"183cebc4cd43188dd9045a954a099b370e9df060e2568fc8bebf61d8d82734c1"} Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.699199 4783 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vr882"] Jan 31 09:12:49 crc kubenswrapper[4783]: I0131 09:12:49.703393 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vr882"] Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663585 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"051d4cca419b8ec45c1a212870d6cc1c55173dd3cd6ec20ab1e099302f2ddcf5"} Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663912 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"161b6ad98d6ecb6f384f2aacdb265ab8abacc9857847c56902d95b518d303266"} Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663927 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"31fcd1c73d831c71105831ea9cf03e73c731e2f83de64cda84a2e0d5eb863b94"} Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663937 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"1ad6800b49c7fda9638e923a0961a985805773ada88c7feedfb819f54d04ece4"} Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663949 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"bfdeb0fdbf22f02f7d1ffae9374f4619a4b080b9b03c54dd5ff74c2f9e79a220"} Jan 31 09:12:50 crc kubenswrapper[4783]: I0131 09:12:50.663961 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"6493276a7f10be33ddbfdcb274539625dc4c0f8a6afd936127eadb9076600089"} Jan 31 09:12:51 crc kubenswrapper[4783]: I0131 09:12:51.653913 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3d03a1-7611-470d-a402-4f40ce95a54f" path="/var/lib/kubelet/pods/4b3d03a1-7611-470d-a402-4f40ce95a54f/volumes" Jan 31 09:12:52 crc kubenswrapper[4783]: I0131 09:12:52.679607 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"f192aa7ddc090285f2a95ab48f4518f50b6d9c6bddde50a66948e7bd7b05a784"} Jan 31 09:12:54 crc kubenswrapper[4783]: I0131 09:12:54.701374 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" event={"ID":"062ed47e-3da3-4f77-bedb-9443d7babc18","Type":"ContainerStarted","Data":"dd0a25f6e254e4a31873a8a4552cb48267b3dec4f5642701d31c06b7ed19bd1f"} Jan 31 09:12:54 crc kubenswrapper[4783]: I0131 09:12:54.701793 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:54 crc kubenswrapper[4783]: I0131 09:12:54.701831 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:54 crc kubenswrapper[4783]: I0131 09:12:54.733572 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" podStartSLOduration=6.733551989 podStartE2EDuration="6.733551989s" podCreationTimestamp="2026-01-31 09:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:12:54.73034685 +0000 UTC m=+485.399030318" watchObservedRunningTime="2026-01-31 09:12:54.733551989 
+0000 UTC m=+485.402235456" Jan 31 09:12:54 crc kubenswrapper[4783]: I0131 09:12:54.742401 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:55 crc kubenswrapper[4783]: I0131 09:12:55.707712 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:55 crc kubenswrapper[4783]: I0131 09:12:55.735395 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:12:59 crc kubenswrapper[4783]: I0131 09:12:59.647936 4783 scope.go:117] "RemoveContainer" containerID="6549acc80da6fb5d0a453b1ca130d10ba98d32d99b730e3ba9ce4cb1e68e87d7" Jan 31 09:13:00 crc kubenswrapper[4783]: I0131 09:13:00.731050 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q8td_0b5ffe9c-191a-4902-8e13-6a869f158784/kube-multus/1.log" Jan 31 09:13:00 crc kubenswrapper[4783]: I0131 09:13:00.731509 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q8td" event={"ID":"0b5ffe9c-191a-4902-8e13-6a869f158784","Type":"ContainerStarted","Data":"a820b5571f1db6d5541e9cf9730c636b073ba43b54259b98d02f21982d917547"} Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.171056 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv"] Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.172862 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.176013 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.178764 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv"] Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.248154 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.248244 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftr7z\" (UniqueName: \"kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.248305 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: 
I0131 09:13:17.349989 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.350086 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr7z\" (UniqueName: \"kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.350112 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.350507 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.350563 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.368371 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr7z\" (UniqueName: \"kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.487490 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.757127 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.757420 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:13:17 crc kubenswrapper[4783]: I0131 09:13:17.839397 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv"] Jan 31 09:13:18 crc kubenswrapper[4783]: I0131 
09:13:18.835231 4783 generic.go:334] "Generic (PLEG): container finished" podID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerID="e0922d0818c2c3e1c1202d633b0cdebef21e78df911132e4585a89ee259387fa" exitCode=0 Jan 31 09:13:18 crc kubenswrapper[4783]: I0131 09:13:18.835348 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" event={"ID":"a19e1f79-11c0-430c-9bea-96c2878fde55","Type":"ContainerDied","Data":"e0922d0818c2c3e1c1202d633b0cdebef21e78df911132e4585a89ee259387fa"} Jan 31 09:13:18 crc kubenswrapper[4783]: I0131 09:13:18.835631 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" event={"ID":"a19e1f79-11c0-430c-9bea-96c2878fde55","Type":"ContainerStarted","Data":"4f7169156979671ec7f3fbf92f772066524e83218d46330da64d7929394e11ad"} Jan 31 09:13:19 crc kubenswrapper[4783]: I0131 09:13:19.037910 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wtx6h" Jan 31 09:13:20 crc kubenswrapper[4783]: I0131 09:13:20.854070 4783 generic.go:334] "Generic (PLEG): container finished" podID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerID="1ffcccb947653eec6d1f3c053d7923bfe2051d4e046cf482463a241766cfd826" exitCode=0 Jan 31 09:13:20 crc kubenswrapper[4783]: I0131 09:13:20.854187 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" event={"ID":"a19e1f79-11c0-430c-9bea-96c2878fde55","Type":"ContainerDied","Data":"1ffcccb947653eec6d1f3c053d7923bfe2051d4e046cf482463a241766cfd826"} Jan 31 09:13:21 crc kubenswrapper[4783]: I0131 09:13:21.861600 4783 generic.go:334] "Generic (PLEG): container finished" podID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerID="79c61228974d23198820d81389583a9daea3d04a0c534c6cb9f9b8492b695a7d" exitCode=0 Jan 31 
09:13:21 crc kubenswrapper[4783]: I0131 09:13:21.861701 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" event={"ID":"a19e1f79-11c0-430c-9bea-96c2878fde55","Type":"ContainerDied","Data":"79c61228974d23198820d81389583a9daea3d04a0c534c6cb9f9b8492b695a7d"} Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.055969 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.206011 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util\") pod \"a19e1f79-11c0-430c-9bea-96c2878fde55\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.206124 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftr7z\" (UniqueName: \"kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z\") pod \"a19e1f79-11c0-430c-9bea-96c2878fde55\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.206184 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle\") pod \"a19e1f79-11c0-430c-9bea-96c2878fde55\" (UID: \"a19e1f79-11c0-430c-9bea-96c2878fde55\") " Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.206971 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle" (OuterVolumeSpecName: "bundle") pod "a19e1f79-11c0-430c-9bea-96c2878fde55" (UID: "a19e1f79-11c0-430c-9bea-96c2878fde55"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.211375 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z" (OuterVolumeSpecName: "kube-api-access-ftr7z") pod "a19e1f79-11c0-430c-9bea-96c2878fde55" (UID: "a19e1f79-11c0-430c-9bea-96c2878fde55"). InnerVolumeSpecName "kube-api-access-ftr7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.265093 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util" (OuterVolumeSpecName: "util") pod "a19e1f79-11c0-430c-9bea-96c2878fde55" (UID: "a19e1f79-11c0-430c-9bea-96c2878fde55"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.307527 4783 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.307553 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftr7z\" (UniqueName: \"kubernetes.io/projected/a19e1f79-11c0-430c-9bea-96c2878fde55-kube-api-access-ftr7z\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.307566 4783 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a19e1f79-11c0-430c-9bea-96c2878fde55-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.880210 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" 
event={"ID":"a19e1f79-11c0-430c-9bea-96c2878fde55","Type":"ContainerDied","Data":"4f7169156979671ec7f3fbf92f772066524e83218d46330da64d7929394e11ad"} Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.880286 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f7169156979671ec7f3fbf92f772066524e83218d46330da64d7929394e11ad" Jan 31 09:13:23 crc kubenswrapper[4783]: I0131 09:13:23.880244 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.060186 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-7nq22"] Jan 31 09:13:25 crc kubenswrapper[4783]: E0131 09:13:25.060685 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="pull" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.060698 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="pull" Jan 31 09:13:25 crc kubenswrapper[4783]: E0131 09:13:25.060711 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="util" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.060718 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="util" Jan 31 09:13:25 crc kubenswrapper[4783]: E0131 09:13:25.060733 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="extract" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.060739 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="extract" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.060828 4783 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a19e1f79-11c0-430c-9bea-96c2878fde55" containerName="extract" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.061251 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.064615 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x7bqf" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.071938 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.073738 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-7nq22"] Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.074176 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.127613 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkqg\" (UniqueName: \"kubernetes.io/projected/66e4885b-8227-4433-8208-12dad761b627-kube-api-access-sbkqg\") pod \"nmstate-operator-646758c888-7nq22\" (UID: \"66e4885b-8227-4433-8208-12dad761b627\") " pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.228379 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkqg\" (UniqueName: \"kubernetes.io/projected/66e4885b-8227-4433-8208-12dad761b627-kube-api-access-sbkqg\") pod \"nmstate-operator-646758c888-7nq22\" (UID: \"66e4885b-8227-4433-8208-12dad761b627\") " pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.244713 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sbkqg\" (UniqueName: \"kubernetes.io/projected/66e4885b-8227-4433-8208-12dad761b627-kube-api-access-sbkqg\") pod \"nmstate-operator-646758c888-7nq22\" (UID: \"66e4885b-8227-4433-8208-12dad761b627\") " pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.373248 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.730074 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-7nq22"] Jan 31 09:13:25 crc kubenswrapper[4783]: W0131 09:13:25.732020 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e4885b_8227_4433_8208_12dad761b627.slice/crio-82172ead37d8c82c674cff6dd6f36071ddc0bc05ddf840600109691b609ba975 WatchSource:0}: Error finding container 82172ead37d8c82c674cff6dd6f36071ddc0bc05ddf840600109691b609ba975: Status 404 returned error can't find the container with id 82172ead37d8c82c674cff6dd6f36071ddc0bc05ddf840600109691b609ba975 Jan 31 09:13:25 crc kubenswrapper[4783]: I0131 09:13:25.891513 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" event={"ID":"66e4885b-8227-4433-8208-12dad761b627","Type":"ContainerStarted","Data":"82172ead37d8c82c674cff6dd6f36071ddc0bc05ddf840600109691b609ba975"} Jan 31 09:13:28 crc kubenswrapper[4783]: I0131 09:13:28.907279 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" event={"ID":"66e4885b-8227-4433-8208-12dad761b627","Type":"ContainerStarted","Data":"1f11adc3651bfef91491359aa04e255b931e48f17ace075d8ea3aba04c5b36d5"} Jan 31 09:13:28 crc kubenswrapper[4783]: I0131 09:13:28.922111 4783 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-646758c888-7nq22" podStartSLOduration=1.3386613889999999 podStartE2EDuration="3.922083761s" podCreationTimestamp="2026-01-31 09:13:25 +0000 UTC" firstStartedPulling="2026-01-31 09:13:25.735060793 +0000 UTC m=+516.403744262" lastFinishedPulling="2026-01-31 09:13:28.318483166 +0000 UTC m=+518.987166634" observedRunningTime="2026-01-31 09:13:28.918843387 +0000 UTC m=+519.587526856" watchObservedRunningTime="2026-01-31 09:13:28.922083761 +0000 UTC m=+519.590767229" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.745043 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kw4vm"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.746463 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.748771 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mtcqs" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.753376 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.754279 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.755832 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.755859 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kw4vm"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.761655 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vc2gq"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.762540 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.768708 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.858856 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.859652 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.861465 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.861496 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-88jv8" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.863556 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.869426 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc"] Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887248 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6782\" (UniqueName: \"kubernetes.io/projected/fe6808ce-fbb6-4782-831b-892b074b7267-kube-api-access-x6782\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887300 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe6808ce-fbb6-4782-831b-892b074b7267-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887336 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzds\" (UniqueName: \"kubernetes.io/projected/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-kube-api-access-nmzds\") pod \"nmstate-handler-vc2gq\" (UID: 
\"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887357 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-nmstate-lock\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887385 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-ovs-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887406 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-dbus-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.887422 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlgh\" (UniqueName: \"kubernetes.io/projected/20903e03-98bb-4970-b9b9-9088bfbd1902-kube-api-access-dzlgh\") pod \"nmstate-metrics-54757c584b-kw4vm\" (UID: \"20903e03-98bb-4970-b9b9-9088bfbd1902\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988396 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzds\" (UniqueName: \"kubernetes.io/projected/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-kube-api-access-nmzds\") pod 
\"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988450 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-nmstate-lock\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988482 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-ovs-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988498 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-dbus-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988517 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlgh\" (UniqueName: \"kubernetes.io/projected/20903e03-98bb-4970-b9b9-9088bfbd1902-kube-api-access-dzlgh\") pod \"nmstate-metrics-54757c584b-kw4vm\" (UID: \"20903e03-98bb-4970-b9b9-9088bfbd1902\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988565 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: 
\"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988585 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988611 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6782\" (UniqueName: \"kubernetes.io/projected/fe6808ce-fbb6-4782-831b-892b074b7267-kube-api-access-x6782\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988620 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-ovs-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988637 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe6808ce-fbb6-4782-831b-892b074b7267-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988737 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rxf\" (UniqueName: 
\"kubernetes.io/projected/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-kube-api-access-98rxf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988730 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-nmstate-lock\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.988993 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-dbus-socket\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:29 crc kubenswrapper[4783]: I0131 09:13:29.993942 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe6808ce-fbb6-4782-831b-892b074b7267-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.004108 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzds\" (UniqueName: \"kubernetes.io/projected/4b90a875-3ddc-4ba4-a62a-dd83c9de4d59-kube-api-access-nmzds\") pod \"nmstate-handler-vc2gq\" (UID: \"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59\") " pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.004260 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlgh\" (UniqueName: 
\"kubernetes.io/projected/20903e03-98bb-4970-b9b9-9088bfbd1902-kube-api-access-dzlgh\") pod \"nmstate-metrics-54757c584b-kw4vm\" (UID: \"20903e03-98bb-4970-b9b9-9088bfbd1902\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.009962 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6782\" (UniqueName: \"kubernetes.io/projected/fe6808ce-fbb6-4782-831b-892b074b7267-kube-api-access-x6782\") pod \"nmstate-webhook-8474b5b9d8-2cm9k\" (UID: \"fe6808ce-fbb6-4782-831b-892b074b7267\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.029936 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5894c48794-l8mmf"] Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.030888 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.041607 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5894c48794-l8mmf"] Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.068537 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.074680 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.089879 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98rxf\" (UniqueName: \"kubernetes.io/projected/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-kube-api-access-98rxf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.089967 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.089989 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.091125 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.093288 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.096155 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.105205 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rxf\" (UniqueName: \"kubernetes.io/projected/dd570070-6aad-4b28-aefd-e4e2ce7e6a8c-kube-api-access-98rxf\") pod \"nmstate-console-plugin-7754f76f8b-58rxc\" (UID: \"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: W0131 09:13:30.118578 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b90a875_3ddc_4ba4_a62a_dd83c9de4d59.slice/crio-582d5078a017a10cae3dd26b3a55bd72b79624b0e47d1bc827f7ef63904a66a6 WatchSource:0}: Error finding container 582d5078a017a10cae3dd26b3a55bd72b79624b0e47d1bc827f7ef63904a66a6: Status 404 returned error can't find the container with id 582d5078a017a10cae3dd26b3a55bd72b79624b0e47d1bc827f7ef63904a66a6 Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.175309 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191005 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191043 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-service-ca\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191080 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-oauth-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191103 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-trusted-ca-bundle\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191128 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-oauth-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191179 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4vn\" (UniqueName: \"kubernetes.io/projected/9baa6542-1dcf-4297-9607-0af68307faa0-kube-api-access-2p4vn\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.191205 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-console-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.278490 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kw4vm"] Jan 31 09:13:30 crc kubenswrapper[4783]: W0131 09:13:30.287511 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20903e03_98bb_4970_b9b9_9088bfbd1902.slice/crio-428c346c3a8269b203917a80bf0d08b7ea1068a1dd76deb1fc557e8bcf5828de WatchSource:0}: Error finding container 428c346c3a8269b203917a80bf0d08b7ea1068a1dd76deb1fc557e8bcf5828de: Status 404 returned error can't find the container with id 428c346c3a8269b203917a80bf0d08b7ea1068a1dd76deb1fc557e8bcf5828de Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291821 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-oauth-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291857 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p4vn\" (UniqueName: \"kubernetes.io/projected/9baa6542-1dcf-4297-9607-0af68307faa0-kube-api-access-2p4vn\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291885 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-console-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291915 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291937 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-service-ca\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291969 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-oauth-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.291992 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-trusted-ca-bundle\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.293132 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-trusted-ca-bundle\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.293348 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-oauth-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.293452 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-console-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.293897 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9baa6542-1dcf-4297-9607-0af68307faa0-service-ca\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.298481 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-oauth-config\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.298780 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9baa6542-1dcf-4297-9607-0af68307faa0-console-serving-cert\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.307578 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p4vn\" (UniqueName: \"kubernetes.io/projected/9baa6542-1dcf-4297-9607-0af68307faa0-kube-api-access-2p4vn\") pod \"console-5894c48794-l8mmf\" (UID: \"9baa6542-1dcf-4297-9607-0af68307faa0\") " pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.343800 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc"] Jan 31 09:13:30 crc kubenswrapper[4783]: W0131 09:13:30.346223 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd570070_6aad_4b28_aefd_e4e2ce7e6a8c.slice/crio-60cfadd289984ec2e60b4b084a7b1ecbb4df22d8d6733869799670f62b3f7657 WatchSource:0}: Error finding container 60cfadd289984ec2e60b4b084a7b1ecbb4df22d8d6733869799670f62b3f7657: Status 404 
returned error can't find the container with id 60cfadd289984ec2e60b4b084a7b1ecbb4df22d8d6733869799670f62b3f7657 Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.347217 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.442507 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k"] Jan 31 09:13:30 crc kubenswrapper[4783]: W0131 09:13:30.445104 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6808ce_fbb6_4782_831b_892b074b7267.slice/crio-f7e961cb4af3989f804e5efe0979fd8240fd94cdba1917abdad6d8ece1015741 WatchSource:0}: Error finding container f7e961cb4af3989f804e5efe0979fd8240fd94cdba1917abdad6d8ece1015741: Status 404 returned error can't find the container with id f7e961cb4af3989f804e5efe0979fd8240fd94cdba1917abdad6d8ece1015741 Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.697063 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5894c48794-l8mmf"] Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.919327 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" event={"ID":"20903e03-98bb-4970-b9b9-9088bfbd1902","Type":"ContainerStarted","Data":"428c346c3a8269b203917a80bf0d08b7ea1068a1dd76deb1fc557e8bcf5828de"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.920396 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" event={"ID":"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c","Type":"ContainerStarted","Data":"60cfadd289984ec2e60b4b084a7b1ecbb4df22d8d6733869799670f62b3f7657"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.921777 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5894c48794-l8mmf" event={"ID":"9baa6542-1dcf-4297-9607-0af68307faa0","Type":"ContainerStarted","Data":"df6d58b5e6ac36514b9d3658b6bff0c5e0f2835758022acf7f6623214a27dd7d"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.921846 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5894c48794-l8mmf" event={"ID":"9baa6542-1dcf-4297-9607-0af68307faa0","Type":"ContainerStarted","Data":"cbd2ba7913692396b7a96a6650c55b18a913ecf9a4dd4338304f94dbe8501931"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.922673 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" event={"ID":"fe6808ce-fbb6-4782-831b-892b074b7267","Type":"ContainerStarted","Data":"f7e961cb4af3989f804e5efe0979fd8240fd94cdba1917abdad6d8ece1015741"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.923588 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vc2gq" event={"ID":"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59","Type":"ContainerStarted","Data":"582d5078a017a10cae3dd26b3a55bd72b79624b0e47d1bc827f7ef63904a66a6"} Jan 31 09:13:30 crc kubenswrapper[4783]: I0131 09:13:30.939225 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5894c48794-l8mmf" podStartSLOduration=0.939203673 podStartE2EDuration="939.203673ms" podCreationTimestamp="2026-01-31 09:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:13:30.934048 +0000 UTC m=+521.602731468" watchObservedRunningTime="2026-01-31 09:13:30.939203673 +0000 UTC m=+521.607887140" Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.948841 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" 
event={"ID":"fe6808ce-fbb6-4782-831b-892b074b7267","Type":"ContainerStarted","Data":"4347bf8779d93d419d00de18cc22e608c210ff12807aca43171595dd3493f94b"} Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.949455 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.950649 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vc2gq" event={"ID":"4b90a875-3ddc-4ba4-a62a-dd83c9de4d59","Type":"ContainerStarted","Data":"8014b7f322891f0d7eb427a292c4b7afbda953d1c56f8ff79e16792952ffd75d"} Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.950802 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.951846 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" event={"ID":"20903e03-98bb-4970-b9b9-9088bfbd1902","Type":"ContainerStarted","Data":"c410d585fcb9557ab83e854fee37002ef15458b80eebd071aec28c7c258a5e19"} Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.953085 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" event={"ID":"dd570070-6aad-4b28-aefd-e4e2ce7e6a8c","Type":"ContainerStarted","Data":"3ae81e22e403ba4533531ba514dab70383407ac95f2b899eb090741fe95ffcb3"} Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.965123 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" podStartSLOduration=2.541729381 podStartE2EDuration="4.965111291s" podCreationTimestamp="2026-01-31 09:13:29 +0000 UTC" firstStartedPulling="2026-01-31 09:13:30.447076062 +0000 UTC m=+521.115759531" lastFinishedPulling="2026-01-31 09:13:32.870457973 +0000 UTC m=+523.539141441" observedRunningTime="2026-01-31 
09:13:33.962562982 +0000 UTC m=+524.631246450" watchObservedRunningTime="2026-01-31 09:13:33.965111291 +0000 UTC m=+524.633794759" Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.979379 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-58rxc" podStartSLOduration=2.460171935 podStartE2EDuration="4.979360971s" podCreationTimestamp="2026-01-31 09:13:29 +0000 UTC" firstStartedPulling="2026-01-31 09:13:30.348514981 +0000 UTC m=+521.017198449" lastFinishedPulling="2026-01-31 09:13:32.867704018 +0000 UTC m=+523.536387485" observedRunningTime="2026-01-31 09:13:33.97694927 +0000 UTC m=+524.645632738" watchObservedRunningTime="2026-01-31 09:13:33.979360971 +0000 UTC m=+524.648044439" Jan 31 09:13:33 crc kubenswrapper[4783]: I0131 09:13:33.991333 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vc2gq" podStartSLOduration=2.23042797 podStartE2EDuration="4.991297937s" podCreationTimestamp="2026-01-31 09:13:29 +0000 UTC" firstStartedPulling="2026-01-31 09:13:30.120750834 +0000 UTC m=+520.789434303" lastFinishedPulling="2026-01-31 09:13:32.881620801 +0000 UTC m=+523.550304270" observedRunningTime="2026-01-31 09:13:33.991217295 +0000 UTC m=+524.659900773" watchObservedRunningTime="2026-01-31 09:13:33.991297937 +0000 UTC m=+524.659981405" Jan 31 09:13:34 crc kubenswrapper[4783]: I0131 09:13:34.965184 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" event={"ID":"20903e03-98bb-4970-b9b9-9088bfbd1902","Type":"ContainerStarted","Data":"2f170cf639134f9f5956a213efb90fb915c68e86ba0f6d363312a109bddac6ad"} Jan 31 09:13:34 crc kubenswrapper[4783]: I0131 09:13:34.982521 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-kw4vm" podStartSLOduration=1.667520508 podStartE2EDuration="5.98250635s" 
podCreationTimestamp="2026-01-31 09:13:29 +0000 UTC" firstStartedPulling="2026-01-31 09:13:30.290718192 +0000 UTC m=+520.959401660" lastFinishedPulling="2026-01-31 09:13:34.605704034 +0000 UTC m=+525.274387502" observedRunningTime="2026-01-31 09:13:34.979230571 +0000 UTC m=+525.647914040" watchObservedRunningTime="2026-01-31 09:13:34.98250635 +0000 UTC m=+525.651189817" Jan 31 09:13:40 crc kubenswrapper[4783]: I0131 09:13:40.114561 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vc2gq" Jan 31 09:13:40 crc kubenswrapper[4783]: I0131 09:13:40.347879 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:40 crc kubenswrapper[4783]: I0131 09:13:40.348777 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:40 crc kubenswrapper[4783]: I0131 09:13:40.352573 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:40 crc kubenswrapper[4783]: I0131 09:13:40.997583 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5894c48794-l8mmf" Jan 31 09:13:41 crc kubenswrapper[4783]: I0131 09:13:41.040671 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:13:47 crc kubenswrapper[4783]: I0131 09:13:47.757050 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:13:47 crc kubenswrapper[4783]: I0131 09:13:47.757658 4783 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:13:50 crc kubenswrapper[4783]: I0131 09:13:50.080291 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-2cm9k" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.143330 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch"] Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.145068 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.146888 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.151855 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch"] Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.343243 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.343317 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.343345 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdsc\" (UniqueName: \"kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.444269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.444338 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.444367 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdsc\" (UniqueName: \"kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.444887 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.444963 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.462312 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdsc\" (UniqueName: \"kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:00 crc kubenswrapper[4783]: I0131 09:14:00.760123 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:01 crc kubenswrapper[4783]: I0131 09:14:01.117931 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch"] Jan 31 09:14:02 crc kubenswrapper[4783]: I0131 09:14:02.096550 4783 generic.go:334] "Generic (PLEG): container finished" podID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerID="8c59076c1ff691353d49f562498f68cb5c2e6af435cb8963d184d42254b240f1" exitCode=0 Jan 31 09:14:02 crc kubenswrapper[4783]: I0131 09:14:02.096664 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" event={"ID":"a8892df4-c6f7-42b1-b003-c7d359c74690","Type":"ContainerDied","Data":"8c59076c1ff691353d49f562498f68cb5c2e6af435cb8963d184d42254b240f1"} Jan 31 09:14:02 crc kubenswrapper[4783]: I0131 09:14:02.096947 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" event={"ID":"a8892df4-c6f7-42b1-b003-c7d359c74690","Type":"ContainerStarted","Data":"e9d9f71a4446ab9e8db9f9b33e4e3d2d9660ee5106465cd668571e28b36f4892"} Jan 31 09:14:04 crc kubenswrapper[4783]: I0131 09:14:04.109954 4783 generic.go:334] "Generic (PLEG): container finished" podID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerID="b82805dc6ab287aa7c27b3cc4ecb41fc1a50f1b1f633b27ef429b48fd14d42f9" exitCode=0 Jan 31 09:14:04 crc kubenswrapper[4783]: I0131 09:14:04.110021 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" event={"ID":"a8892df4-c6f7-42b1-b003-c7d359c74690","Type":"ContainerDied","Data":"b82805dc6ab287aa7c27b3cc4ecb41fc1a50f1b1f633b27ef429b48fd14d42f9"} Jan 31 09:14:05 crc kubenswrapper[4783]: I0131 09:14:05.118023 4783 
generic.go:334] "Generic (PLEG): container finished" podID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerID="f9f6d42a6d4902889e237ad6c93db0713dc32be039a6e26d6a602e9a67c49369" exitCode=0 Jan 31 09:14:05 crc kubenswrapper[4783]: I0131 09:14:05.118060 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" event={"ID":"a8892df4-c6f7-42b1-b003-c7d359c74690","Type":"ContainerDied","Data":"f9f6d42a6d4902889e237ad6c93db0713dc32be039a6e26d6a602e9a67c49369"} Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.071110 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xjwbp" podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerName="console" containerID="cri-o://0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495" gracePeriod=15 Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.319318 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.380150 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xjwbp_1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266/console/0.log" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.380264 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413528 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413571 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util\") pod \"a8892df4-c6f7-42b1-b003-c7d359c74690\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413605 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle\") pod \"a8892df4-c6f7-42b1-b003-c7d359c74690\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413638 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413662 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413699 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413724 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7gm7\" (UniqueName: \"kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413753 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413779 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config\") pod \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\" (UID: \"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.413802 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vdsc\" (UniqueName: \"kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc\") pod \"a8892df4-c6f7-42b1-b003-c7d359c74690\" (UID: \"a8892df4-c6f7-42b1-b003-c7d359c74690\") " Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.414516 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca" (OuterVolumeSpecName: "service-ca") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.414588 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.414635 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.414905 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config" (OuterVolumeSpecName: "console-config") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.415178 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle" (OuterVolumeSpecName: "bundle") pod "a8892df4-c6f7-42b1-b003-c7d359c74690" (UID: "a8892df4-c6f7-42b1-b003-c7d359c74690"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.420186 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc" (OuterVolumeSpecName: "kube-api-access-6vdsc") pod "a8892df4-c6f7-42b1-b003-c7d359c74690" (UID: "a8892df4-c6f7-42b1-b003-c7d359c74690"). InnerVolumeSpecName "kube-api-access-6vdsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.420789 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7" (OuterVolumeSpecName: "kube-api-access-c7gm7") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "kube-api-access-c7gm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.420824 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.420970 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" (UID: "1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.425196 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util" (OuterVolumeSpecName: "util") pod "a8892df4-c6f7-42b1-b003-c7d359c74690" (UID: "a8892df4-c6f7-42b1-b003-c7d359c74690"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515729 4783 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515776 4783 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515788 4783 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515800 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7gm7\" (UniqueName: \"kubernetes.io/projected/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-kube-api-access-c7gm7\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515816 4783 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515826 4783 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515835 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vdsc\" (UniqueName: \"kubernetes.io/projected/a8892df4-c6f7-42b1-b003-c7d359c74690-kube-api-access-6vdsc\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515845 4783 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515855 4783 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:06 crc kubenswrapper[4783]: I0131 09:14:06.515868 4783 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8892df4-c6f7-42b1-b003-c7d359c74690-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.128668 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" event={"ID":"a8892df4-c6f7-42b1-b003-c7d359c74690","Type":"ContainerDied","Data":"e9d9f71a4446ab9e8db9f9b33e4e3d2d9660ee5106465cd668571e28b36f4892"} Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.128933 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9d9f71a4446ab9e8db9f9b33e4e3d2d9660ee5106465cd668571e28b36f4892" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.128691 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130126 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xjwbp_1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266/console/0.log" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130185 4783 generic.go:334] "Generic (PLEG): container finished" podID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerID="0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495" exitCode=2 Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130208 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjwbp" event={"ID":"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266","Type":"ContainerDied","Data":"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495"} Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130235 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xjwbp" event={"ID":"1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266","Type":"ContainerDied","Data":"1caf0a8be7b8acab161bc4bf4ccbafaf43204c73585d52a9f1f1a9487ee8dd8b"} Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130260 4783 scope.go:117] "RemoveContainer" containerID="0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.130299 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xjwbp" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.148461 4783 scope.go:117] "RemoveContainer" containerID="0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495" Jan 31 09:14:07 crc kubenswrapper[4783]: E0131 09:14:07.149143 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495\": container with ID starting with 0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495 not found: ID does not exist" containerID="0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.149196 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495"} err="failed to get container status \"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495\": rpc error: code = NotFound desc = could not find container \"0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495\": container with ID starting with 0fc98378525172db507b9435f65f533bed519862359ceea4df7777836875b495 not found: ID does not exist" Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.164862 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.167958 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xjwbp"] Jan 31 09:14:07 crc kubenswrapper[4783]: I0131 09:14:07.651418 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" path="/var/lib/kubelet/pods/1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266/volumes" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.937442 4783 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw"] Jan 31 09:14:14 crc kubenswrapper[4783]: E0131 09:14:14.938129 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="util" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938143 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="util" Jan 31 09:14:14 crc kubenswrapper[4783]: E0131 09:14:14.938154 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="extract" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938178 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="extract" Jan 31 09:14:14 crc kubenswrapper[4783]: E0131 09:14:14.938193 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="pull" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938200 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="pull" Jan 31 09:14:14 crc kubenswrapper[4783]: E0131 09:14:14.938212 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerName="console" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938218 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerName="console" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938312 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8892df4-c6f7-42b1-b003-c7d359c74690" containerName="extract" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938329 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0dabea-ff61-4a2e-9f7b-70e1d3e2c266" containerName="console" Jan 31 
09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.938720 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.940053 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.940899 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.944520 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.945382 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.945397 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7xlqj" Jan 31 09:14:14 crc kubenswrapper[4783]: I0131 09:14:14.954003 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw"] Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.032785 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-apiservice-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.032859 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-webhook-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.032902 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnl7b\" (UniqueName: \"kubernetes.io/projected/db52c704-6d54-4f49-9168-903b12ed4a25-kube-api-access-lnl7b\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.133992 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-apiservice-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.134051 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-webhook-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.134089 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnl7b\" (UniqueName: \"kubernetes.io/projected/db52c704-6d54-4f49-9168-903b12ed4a25-kube-api-access-lnl7b\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " 
pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.142150 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-webhook-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.143771 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db52c704-6d54-4f49-9168-903b12ed4a25-apiservice-cert\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.153839 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnl7b\" (UniqueName: \"kubernetes.io/projected/db52c704-6d54-4f49-9168-903b12ed4a25-kube-api-access-lnl7b\") pod \"metallb-operator-controller-manager-84cd58cb5d-hkgvw\" (UID: \"db52c704-6d54-4f49-9168-903b12ed4a25\") " pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.253227 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.262262 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs"] Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.263299 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.265230 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.265427 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zhtm9" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.268434 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.278842 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs"] Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.336929 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-webhook-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.337121 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.337182 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bxl\" (UniqueName: 
\"kubernetes.io/projected/9392ad71-70f9-4727-baaa-68ddfa6b3361-kube-api-access-v9bxl\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.438115 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.438196 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9bxl\" (UniqueName: \"kubernetes.io/projected/9392ad71-70f9-4727-baaa-68ddfa6b3361-kube-api-access-v9bxl\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.438269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-webhook-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.443559 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" 
Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.451981 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9392ad71-70f9-4727-baaa-68ddfa6b3361-webhook-cert\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.453796 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bxl\" (UniqueName: \"kubernetes.io/projected/9392ad71-70f9-4727-baaa-68ddfa6b3361-kube-api-access-v9bxl\") pod \"metallb-operator-webhook-server-6cbf6c4975-7nlfs\" (UID: \"9392ad71-70f9-4727-baaa-68ddfa6b3361\") " pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.605550 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.681670 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw"] Jan 31 09:14:15 crc kubenswrapper[4783]: I0131 09:14:15.833428 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs"] Jan 31 09:14:15 crc kubenswrapper[4783]: W0131 09:14:15.842396 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9392ad71_70f9_4727_baaa_68ddfa6b3361.slice/crio-dc6ddb9c6c9135d5fbfdb1ac7a47453b03342b1bff7710a373c8be3ba5b926db WatchSource:0}: Error finding container dc6ddb9c6c9135d5fbfdb1ac7a47453b03342b1bff7710a373c8be3ba5b926db: Status 404 returned error can't find the container with id dc6ddb9c6c9135d5fbfdb1ac7a47453b03342b1bff7710a373c8be3ba5b926db Jan 31 
09:14:16 crc kubenswrapper[4783]: I0131 09:14:16.180439 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" event={"ID":"db52c704-6d54-4f49-9168-903b12ed4a25","Type":"ContainerStarted","Data":"99072d21cf031e066c7c08d5edcab5283acc0dec350034826bae5571b11fa22a"} Jan 31 09:14:16 crc kubenswrapper[4783]: I0131 09:14:16.181646 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" event={"ID":"9392ad71-70f9-4727-baaa-68ddfa6b3361","Type":"ContainerStarted","Data":"dc6ddb9c6c9135d5fbfdb1ac7a47453b03342b1bff7710a373c8be3ba5b926db"} Jan 31 09:14:17 crc kubenswrapper[4783]: I0131 09:14:17.756604 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:14:17 crc kubenswrapper[4783]: I0131 09:14:17.756675 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:14:17 crc kubenswrapper[4783]: I0131 09:14:17.756737 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:14:17 crc kubenswrapper[4783]: I0131 09:14:17.757604 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:14:17 crc kubenswrapper[4783]: I0131 09:14:17.757715 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c" gracePeriod=600 Jan 31 09:14:18 crc kubenswrapper[4783]: I0131 09:14:18.195857 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c" exitCode=0 Jan 31 09:14:18 crc kubenswrapper[4783]: I0131 09:14:18.195954 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c"} Jan 31 09:14:18 crc kubenswrapper[4783]: I0131 09:14:18.196106 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663"} Jan 31 09:14:18 crc kubenswrapper[4783]: I0131 09:14:18.196127 4783 scope.go:117] "RemoveContainer" containerID="eb0c7fd7fa4ed1c1e3f1dc52fb6d93f057aa5a1f9ffa937b84cf1761c03b046a" Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.219010 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" event={"ID":"db52c704-6d54-4f49-9168-903b12ed4a25","Type":"ContainerStarted","Data":"3750580cb3d25b6bbd8d8d8704988688f4d9e19984cc74b9df6f7552d5bf0773"} Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.220624 
4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.221605 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" event={"ID":"9392ad71-70f9-4727-baaa-68ddfa6b3361","Type":"ContainerStarted","Data":"2ad4085b4dd86dae18edf88154ed6d920fbe23a93fbae089bdd6c8d89d77b6ec"} Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.221745 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.259594 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" podStartSLOduration=2.235486611 podStartE2EDuration="7.25957528s" podCreationTimestamp="2026-01-31 09:14:14 +0000 UTC" firstStartedPulling="2026-01-31 09:14:15.697797816 +0000 UTC m=+566.366481283" lastFinishedPulling="2026-01-31 09:14:20.721886484 +0000 UTC m=+571.390569952" observedRunningTime="2026-01-31 09:14:21.255149026 +0000 UTC m=+571.923832484" watchObservedRunningTime="2026-01-31 09:14:21.25957528 +0000 UTC m=+571.928258748" Jan 31 09:14:21 crc kubenswrapper[4783]: I0131 09:14:21.273119 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" podStartSLOduration=1.39303052 podStartE2EDuration="6.273104054s" podCreationTimestamp="2026-01-31 09:14:15 +0000 UTC" firstStartedPulling="2026-01-31 09:14:15.845847486 +0000 UTC m=+566.514530954" lastFinishedPulling="2026-01-31 09:14:20.72592102 +0000 UTC m=+571.394604488" observedRunningTime="2026-01-31 09:14:21.270410492 +0000 UTC m=+571.939093960" watchObservedRunningTime="2026-01-31 09:14:21.273104054 +0000 UTC m=+571.941787522" Jan 31 09:14:35 crc 
kubenswrapper[4783]: I0131 09:14:35.612650 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cbf6c4975-7nlfs" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.256088 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84cd58cb5d-hkgvw" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.831403 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x77mn"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.833439 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.833989 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.834617 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.835859 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.836134 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.836465 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.836595 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-82f29" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.846192 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.894663 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hn6z7"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.895640 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hn6z7" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.897231 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.901468 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.901696 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.901834 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4xfnd" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.908820 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-qcxmw"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.909642 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.916418 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.921223 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-qcxmw"] Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954074 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954199 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gkxb\" (UniqueName: \"kubernetes.io/projected/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-kube-api-access-7gkxb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954242 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics-certs\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954276 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5x6\" (UniqueName: \"kubernetes.io/projected/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-kube-api-access-9z5x6\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " 
pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954306 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-startup\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954342 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-conf\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954413 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-sockets\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.954555 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-reloader\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:55 crc kubenswrapper[4783]: I0131 09:14:55.955063 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 
09:14:56.056765 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057118 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057154 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057188 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057223 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gkxb\" (UniqueName: \"kubernetes.io/projected/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-kube-api-access-7gkxb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057259 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28l4\" (UniqueName: \"kubernetes.io/projected/90502cfa-f884-4181-b14c-98b49f254530-kube-api-access-p28l4\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057285 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics-certs\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057308 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metrics-certs\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057333 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5x6\" (UniqueName: \"kubernetes.io/projected/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-kube-api-access-9z5x6\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057355 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-startup\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057381 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkbdr\" 
(UniqueName: \"kubernetes.io/projected/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-kube-api-access-lkbdr\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057404 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-conf\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057425 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metallb-excludel2\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057477 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-sockets\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057496 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-cert\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057514 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-reloader\") pod \"frr-k8s-x77mn\" (UID: 
\"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.057856 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-reloader\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.060175 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.060287 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-conf\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.060485 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-sockets\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.066912 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-frr-startup\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.083013 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-metrics-certs\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.083894 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.086379 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5x6\" (UniqueName: \"kubernetes.io/projected/f71dbce1-2082-4ee0-8b6b-21fdf4313b06-kube-api-access-9z5x6\") pod \"frr-k8s-x77mn\" (UID: \"f71dbce1-2082-4ee0-8b6b-21fdf4313b06\") " pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.086533 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gkxb\" (UniqueName: \"kubernetes.io/projected/1e56e9fc-1576-4315-97b2-fa45c03bb8ca-kube-api-access-7gkxb\") pod \"frr-k8s-webhook-server-7df86c4f6c-2q4pc\" (UID: \"1e56e9fc-1576-4315-97b2-fa45c03bb8ca\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.148562 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.151933 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.158862 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.158908 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.159066 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p28l4\" (UniqueName: \"kubernetes.io/projected/90502cfa-f884-4181-b14c-98b49f254530-kube-api-access-p28l4\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.159099 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metrics-certs\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.159130 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkbdr\" (UniqueName: \"kubernetes.io/projected/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-kube-api-access-lkbdr\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc 
kubenswrapper[4783]: I0131 09:14:56.159157 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metallb-excludel2\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.159203 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-cert\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.159397 4783 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.159466 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist podName:9d4ef83f-da80-4f86-8e3f-6618d9bd5c44 nodeName:}" failed. No retries permitted until 2026-01-31 09:14:56.659444961 +0000 UTC m=+607.328128428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist") pod "speaker-hn6z7" (UID: "9d4ef83f-da80-4f86-8e3f-6618d9bd5c44") : secret "metallb-memberlist" not found Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.159761 4783 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.159793 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs podName:90502cfa-f884-4181-b14c-98b49f254530 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:14:56.659785723 +0000 UTC m=+607.328469191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs") pod "controller-6968d8fdc4-qcxmw" (UID: "90502cfa-f884-4181-b14c-98b49f254530") : secret "controller-certs-secret" not found Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.162937 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metallb-excludel2\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.163551 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-metrics-certs\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.165893 4783 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.174174 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p28l4\" (UniqueName: \"kubernetes.io/projected/90502cfa-f884-4181-b14c-98b49f254530-kube-api-access-p28l4\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.174896 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-cert\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " 
pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.175835 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkbdr\" (UniqueName: \"kubernetes.io/projected/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-kube-api-access-lkbdr\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.406155 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"acfad3848ee297d794f5a21190be1fd0788d2e2b227e2b9a8c662e07ef62f25c"} Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.520451 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc"] Jan 31 09:14:56 crc kubenswrapper[4783]: W0131 09:14:56.527407 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e56e9fc_1576_4315_97b2_fa45c03bb8ca.slice/crio-570a80c29194d2b2fe0ca932218638bcc640d8a6a6eca0ed3e4cf74fb7673204 WatchSource:0}: Error finding container 570a80c29194d2b2fe0ca932218638bcc640d8a6a6eca0ed3e4cf74fb7673204: Status 404 returned error can't find the container with id 570a80c29194d2b2fe0ca932218638bcc640d8a6a6eca0ed3e4cf74fb7673204 Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.666249 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.666309 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.666461 4783 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:14:56 crc kubenswrapper[4783]: E0131 09:14:56.666549 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist podName:9d4ef83f-da80-4f86-8e3f-6618d9bd5c44 nodeName:}" failed. No retries permitted until 2026-01-31 09:14:57.666529417 +0000 UTC m=+608.335212885 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist") pod "speaker-hn6z7" (UID: "9d4ef83f-da80-4f86-8e3f-6618d9bd5c44") : secret "metallb-memberlist" not found Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.671579 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90502cfa-f884-4181-b14c-98b49f254530-metrics-certs\") pod \"controller-6968d8fdc4-qcxmw\" (UID: \"90502cfa-f884-4181-b14c-98b49f254530\") " pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:56 crc kubenswrapper[4783]: I0131 09:14:56.825495 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.221953 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-qcxmw"] Jan 31 09:14:57 crc kubenswrapper[4783]: W0131 09:14:57.229438 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90502cfa_f884_4181_b14c_98b49f254530.slice/crio-205b99c81fbe49fb230cc939235e703805535e0671cba188c32613b9306b82a1 WatchSource:0}: Error finding container 205b99c81fbe49fb230cc939235e703805535e0671cba188c32613b9306b82a1: Status 404 returned error can't find the container with id 205b99c81fbe49fb230cc939235e703805535e0671cba188c32613b9306b82a1 Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.410773 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-qcxmw" event={"ID":"90502cfa-f884-4181-b14c-98b49f254530","Type":"ContainerStarted","Data":"c5a6d3bc6aaed0e5e18a2a96a2aa485ac98aae1228af1a102e35ea4c66d290b7"} Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.411051 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-qcxmw" event={"ID":"90502cfa-f884-4181-b14c-98b49f254530","Type":"ContainerStarted","Data":"205b99c81fbe49fb230cc939235e703805535e0671cba188c32613b9306b82a1"} Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.411707 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" event={"ID":"1e56e9fc-1576-4315-97b2-fa45c03bb8ca","Type":"ContainerStarted","Data":"570a80c29194d2b2fe0ca932218638bcc640d8a6a6eca0ed3e4cf74fb7673204"} Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.680528 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist\") pod 
\"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.687252 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d4ef83f-da80-4f86-8e3f-6618d9bd5c44-memberlist\") pod \"speaker-hn6z7\" (UID: \"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44\") " pod="metallb-system/speaker-hn6z7" Jan 31 09:14:57 crc kubenswrapper[4783]: I0131 09:14:57.717387 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hn6z7" Jan 31 09:14:57 crc kubenswrapper[4783]: W0131 09:14:57.737369 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4ef83f_da80_4f86_8e3f_6618d9bd5c44.slice/crio-1a622fa8aa5566de8ced38effc93d48ba94d9dba42ae571617a02aed5ac26c13 WatchSource:0}: Error finding container 1a622fa8aa5566de8ced38effc93d48ba94d9dba42ae571617a02aed5ac26c13: Status 404 returned error can't find the container with id 1a622fa8aa5566de8ced38effc93d48ba94d9dba42ae571617a02aed5ac26c13 Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.420049 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-qcxmw" event={"ID":"90502cfa-f884-4181-b14c-98b49f254530","Type":"ContainerStarted","Data":"865b46a8f879534adcda5b6aada0ed075dd86a829e0d65a4e4e50bfcd6bd4aef"} Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.420379 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.423133 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn6z7" event={"ID":"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44","Type":"ContainerStarted","Data":"993ca842ffd43e6105a78c12838bb0b34d3078ce413cff290f2c07bcc889405b"} Jan 31 09:14:58 crc 
kubenswrapper[4783]: I0131 09:14:58.423192 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn6z7" event={"ID":"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44","Type":"ContainerStarted","Data":"1aeda5e8fc040dfbbf8175b883df080903b2d7e2b0cf72141bb4d1fc5a5da9fc"} Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.423207 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hn6z7" event={"ID":"9d4ef83f-da80-4f86-8e3f-6618d9bd5c44","Type":"ContainerStarted","Data":"1a622fa8aa5566de8ced38effc93d48ba94d9dba42ae571617a02aed5ac26c13"} Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.423373 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hn6z7" Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.437675 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-qcxmw" podStartSLOduration=3.437659122 podStartE2EDuration="3.437659122s" podCreationTimestamp="2026-01-31 09:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:14:58.43390547 +0000 UTC m=+609.102588939" watchObservedRunningTime="2026-01-31 09:14:58.437659122 +0000 UTC m=+609.106342589" Jan 31 09:14:58 crc kubenswrapper[4783]: I0131 09:14:58.448425 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hn6z7" podStartSLOduration=3.448417567 podStartE2EDuration="3.448417567s" podCreationTimestamp="2026-01-31 09:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:14:58.446356013 +0000 UTC m=+609.115039481" watchObservedRunningTime="2026-01-31 09:14:58.448417567 +0000 UTC m=+609.117101025" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.140260 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd"] Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.140913 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.145665 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.146410 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.151770 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd"] Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.220148 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.220289 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.220392 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctznh\" (UniqueName: 
\"kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.321713 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.321774 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.321826 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctznh\" (UniqueName: \"kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.322920 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 
09:15:00.328567 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.336208 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctznh\" (UniqueName: \"kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh\") pod \"collect-profiles-29497515-dbtqd\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.453397 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:00 crc kubenswrapper[4783]: I0131 09:15:00.879608 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd"] Jan 31 09:15:00 crc kubenswrapper[4783]: W0131 09:15:00.891083 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0f6d8f_df21_48f3_817a_42e0363934cb.slice/crio-ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad WatchSource:0}: Error finding container ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad: Status 404 returned error can't find the container with id ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad Jan 31 09:15:01 crc kubenswrapper[4783]: I0131 09:15:01.448077 4783 generic.go:334] "Generic (PLEG): container finished" podID="cc0f6d8f-df21-48f3-817a-42e0363934cb" containerID="54749afc2bd331aaecae5feba5355cdd60b03efd7a49531d827f62dcee5c84ce" exitCode=0 Jan 31 
09:15:01 crc kubenswrapper[4783]: I0131 09:15:01.448124 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" event={"ID":"cc0f6d8f-df21-48f3-817a-42e0363934cb","Type":"ContainerDied","Data":"54749afc2bd331aaecae5feba5355cdd60b03efd7a49531d827f62dcee5c84ce"} Jan 31 09:15:01 crc kubenswrapper[4783]: I0131 09:15:01.448176 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" event={"ID":"cc0f6d8f-df21-48f3-817a-42e0363934cb","Type":"ContainerStarted","Data":"ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad"} Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.574728 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.684056 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume\") pod \"cc0f6d8f-df21-48f3-817a-42e0363934cb\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.684139 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume\") pod \"cc0f6d8f-df21-48f3-817a-42e0363934cb\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.684772 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc0f6d8f-df21-48f3-817a-42e0363934cb" (UID: "cc0f6d8f-df21-48f3-817a-42e0363934cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.685010 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctznh\" (UniqueName: \"kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh\") pod \"cc0f6d8f-df21-48f3-817a-42e0363934cb\" (UID: \"cc0f6d8f-df21-48f3-817a-42e0363934cb\") " Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.685525 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc0f6d8f-df21-48f3-817a-42e0363934cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.691523 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh" (OuterVolumeSpecName: "kube-api-access-ctznh") pod "cc0f6d8f-df21-48f3-817a-42e0363934cb" (UID: "cc0f6d8f-df21-48f3-817a-42e0363934cb"). InnerVolumeSpecName "kube-api-access-ctznh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.692327 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc0f6d8f-df21-48f3-817a-42e0363934cb" (UID: "cc0f6d8f-df21-48f3-817a-42e0363934cb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.787189 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctznh\" (UniqueName: \"kubernetes.io/projected/cc0f6d8f-df21-48f3-817a-42e0363934cb-kube-api-access-ctznh\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:04 crc kubenswrapper[4783]: I0131 09:15:04.787223 4783 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc0f6d8f-df21-48f3-817a-42e0363934cb-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.482870 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" event={"ID":"1e56e9fc-1576-4315-97b2-fa45c03bb8ca","Type":"ContainerStarted","Data":"765870273fe7ad20416c007af7ed96dde49b662cae7bb265cf6cd5e2f37795ca"} Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.483323 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.484809 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.484827 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd" event={"ID":"cc0f6d8f-df21-48f3-817a-42e0363934cb","Type":"ContainerDied","Data":"ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad"} Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.485247 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad77d9bdbca874ba80ac9bf6c0e8f77f2e87eb09e39fb1ecae6811fbe50984ad" Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.487345 4783 generic.go:334] "Generic (PLEG): container finished" podID="f71dbce1-2082-4ee0-8b6b-21fdf4313b06" containerID="2affd01d37e47ea0c39d5b4acb7d5ba34fb0f2a1319ce58533896d155407a9ff" exitCode=0 Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.487385 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerDied","Data":"2affd01d37e47ea0c39d5b4acb7d5ba34fb0f2a1319ce58533896d155407a9ff"} Jan 31 09:15:05 crc kubenswrapper[4783]: I0131 09:15:05.509214 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" podStartSLOduration=2.449193728 podStartE2EDuration="10.509194523s" podCreationTimestamp="2026-01-31 09:14:55 +0000 UTC" firstStartedPulling="2026-01-31 09:14:56.530962035 +0000 UTC m=+607.199645493" lastFinishedPulling="2026-01-31 09:15:04.59096282 +0000 UTC m=+615.259646288" observedRunningTime="2026-01-31 09:15:05.501237695 +0000 UTC m=+616.169921162" watchObservedRunningTime="2026-01-31 09:15:05.509194523 +0000 UTC m=+616.177877990" Jan 31 09:15:06 crc kubenswrapper[4783]: I0131 09:15:06.498145 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="f71dbce1-2082-4ee0-8b6b-21fdf4313b06" containerID="6c8b76cb29df0881bd4d85dc1e2628b3e41214b51a1a0e146471e9eb51543fd4" exitCode=0 Jan 31 09:15:06 crc kubenswrapper[4783]: I0131 09:15:06.498228 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerDied","Data":"6c8b76cb29df0881bd4d85dc1e2628b3e41214b51a1a0e146471e9eb51543fd4"} Jan 31 09:15:07 crc kubenswrapper[4783]: I0131 09:15:07.504963 4783 generic.go:334] "Generic (PLEG): container finished" podID="f71dbce1-2082-4ee0-8b6b-21fdf4313b06" containerID="4196d8116cc713aac9a4a2f8bca2caa96ca46b0d474375ce3d50e155d227cc0f" exitCode=0 Jan 31 09:15:07 crc kubenswrapper[4783]: I0131 09:15:07.505031 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerDied","Data":"4196d8116cc713aac9a4a2f8bca2caa96ca46b0d474375ce3d50e155d227cc0f"} Jan 31 09:15:07 crc kubenswrapper[4783]: I0131 09:15:07.723212 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hn6z7" Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516698 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"5e8471af4d0c4a72b86f9ac094962cdab205f0ee7f7bffedca401a3076505463"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516765 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"d4f41d416f70e194f61d8380dbee54510453c4baa3dbc272e5ef4eda889d2def"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516778 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" 
event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"52db4ceb2f577cc77a7244caa9e8e100c6f71246972fee329369e27dbcb396d4"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516789 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"64b85343e1c61fc021e1d891fe2c5dcef5170b86c89fc4107f355dad1c3d6e67"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516799 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"5406887e5337feaac77f492098c9fd7507566cf6d8a83e1ed2d3f23b7c72b760"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516809 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x77mn" event={"ID":"f71dbce1-2082-4ee0-8b6b-21fdf4313b06","Type":"ContainerStarted","Data":"67d7c06fb8695ae9a11d0cf7c33d232cc9d3fa0fc0817376503a9053dc3ade40"} Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.516912 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:15:08 crc kubenswrapper[4783]: I0131 09:15:08.535936 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-x77mn" podStartSLOduration=5.224172634 podStartE2EDuration="13.53590396s" podCreationTimestamp="2026-01-31 09:14:55 +0000 UTC" firstStartedPulling="2026-01-31 09:14:56.283332571 +0000 UTC m=+606.952016038" lastFinishedPulling="2026-01-31 09:15:04.595063895 +0000 UTC m=+615.263747364" observedRunningTime="2026-01-31 09:15:08.532939195 +0000 UTC m=+619.201622663" watchObservedRunningTime="2026-01-31 09:15:08.53590396 +0000 UTC m=+619.204587427" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.904076 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:09 crc kubenswrapper[4783]: E0131 09:15:09.904935 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0f6d8f-df21-48f3-817a-42e0363934cb" containerName="collect-profiles" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.904951 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0f6d8f-df21-48f3-817a-42e0363934cb" containerName="collect-profiles" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.905089 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0f6d8f-df21-48f3-817a-42e0363934cb" containerName="collect-profiles" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.905651 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.911786 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2lhkf" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.912248 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.920239 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.922778 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 31 09:15:09 crc kubenswrapper[4783]: I0131 09:15:09.965035 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jxp\" (UniqueName: \"kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp\") pod \"openstack-operator-index-ppxt9\" (UID: \"d40b6e37-c50a-4419-8eb0-bcebde91bcb8\") " 
pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:10 crc kubenswrapper[4783]: I0131 09:15:10.066603 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jxp\" (UniqueName: \"kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp\") pod \"openstack-operator-index-ppxt9\" (UID: \"d40b6e37-c50a-4419-8eb0-bcebde91bcb8\") " pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:10 crc kubenswrapper[4783]: I0131 09:15:10.084253 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jxp\" (UniqueName: \"kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp\") pod \"openstack-operator-index-ppxt9\" (UID: \"d40b6e37-c50a-4419-8eb0-bcebde91bcb8\") " pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:10 crc kubenswrapper[4783]: I0131 09:15:10.231570 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:10 crc kubenswrapper[4783]: I0131 09:15:10.621489 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:10 crc kubenswrapper[4783]: W0131 09:15:10.624606 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40b6e37_c50a_4419_8eb0_bcebde91bcb8.slice/crio-92c5f75152116193bcc58f6855b1c2a40bf1246d62b245ea1b64e14b7b3797de WatchSource:0}: Error finding container 92c5f75152116193bcc58f6855b1c2a40bf1246d62b245ea1b64e14b7b3797de: Status 404 returned error can't find the container with id 92c5f75152116193bcc58f6855b1c2a40bf1246d62b245ea1b64e14b7b3797de Jan 31 09:15:11 crc kubenswrapper[4783]: I0131 09:15:11.149682 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:15:11 crc 
kubenswrapper[4783]: I0131 09:15:11.183694 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:15:11 crc kubenswrapper[4783]: I0131 09:15:11.541740 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ppxt9" event={"ID":"d40b6e37-c50a-4419-8eb0-bcebde91bcb8","Type":"ContainerStarted","Data":"92c5f75152116193bcc58f6855b1c2a40bf1246d62b245ea1b64e14b7b3797de"} Jan 31 09:15:12 crc kubenswrapper[4783]: I0131 09:15:12.550963 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ppxt9" event={"ID":"d40b6e37-c50a-4419-8eb0-bcebde91bcb8","Type":"ContainerStarted","Data":"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02"} Jan 31 09:15:12 crc kubenswrapper[4783]: I0131 09:15:12.564709 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ppxt9" podStartSLOduration=2.345215037 podStartE2EDuration="3.564697503s" podCreationTimestamp="2026-01-31 09:15:09 +0000 UTC" firstStartedPulling="2026-01-31 09:15:10.626717449 +0000 UTC m=+621.295400917" lastFinishedPulling="2026-01-31 09:15:11.846199915 +0000 UTC m=+622.514883383" observedRunningTime="2026-01-31 09:15:12.56273665 +0000 UTC m=+623.231420118" watchObservedRunningTime="2026-01-31 09:15:12.564697503 +0000 UTC m=+623.233380971" Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.088947 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.693339 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z2jkn"] Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.694134 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.701667 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2jkn"] Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.820154 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8j2\" (UniqueName: \"kubernetes.io/projected/474ac824-d8f5-4d2b-9b6f-c385808d57b8-kube-api-access-9l8j2\") pod \"openstack-operator-index-z2jkn\" (UID: \"474ac824-d8f5-4d2b-9b6f-c385808d57b8\") " pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.922319 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8j2\" (UniqueName: \"kubernetes.io/projected/474ac824-d8f5-4d2b-9b6f-c385808d57b8-kube-api-access-9l8j2\") pod \"openstack-operator-index-z2jkn\" (UID: \"474ac824-d8f5-4d2b-9b6f-c385808d57b8\") " pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:13 crc kubenswrapper[4783]: I0131 09:15:13.939884 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8j2\" (UniqueName: \"kubernetes.io/projected/474ac824-d8f5-4d2b-9b6f-c385808d57b8-kube-api-access-9l8j2\") pod \"openstack-operator-index-z2jkn\" (UID: \"474ac824-d8f5-4d2b-9b6f-c385808d57b8\") " pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:14 crc kubenswrapper[4783]: I0131 09:15:14.023473 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:14 crc kubenswrapper[4783]: I0131 09:15:14.400010 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2jkn"] Jan 31 09:15:14 crc kubenswrapper[4783]: I0131 09:15:14.565935 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2jkn" event={"ID":"474ac824-d8f5-4d2b-9b6f-c385808d57b8","Type":"ContainerStarted","Data":"b4d51ce508c55ba3bfbe879b39cfe59c83a52acee9caf61579d33e0a4ede135f"} Jan 31 09:15:14 crc kubenswrapper[4783]: I0131 09:15:14.566244 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ppxt9" podUID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" containerName="registry-server" containerID="cri-o://c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02" gracePeriod=2 Jan 31 09:15:14 crc kubenswrapper[4783]: I0131 09:15:14.857524 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.037142 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9jxp\" (UniqueName: \"kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp\") pod \"d40b6e37-c50a-4419-8eb0-bcebde91bcb8\" (UID: \"d40b6e37-c50a-4419-8eb0-bcebde91bcb8\") " Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.041982 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp" (OuterVolumeSpecName: "kube-api-access-q9jxp") pod "d40b6e37-c50a-4419-8eb0-bcebde91bcb8" (UID: "d40b6e37-c50a-4419-8eb0-bcebde91bcb8"). InnerVolumeSpecName "kube-api-access-q9jxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.138700 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9jxp\" (UniqueName: \"kubernetes.io/projected/d40b6e37-c50a-4419-8eb0-bcebde91bcb8-kube-api-access-q9jxp\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.572365 4783 generic.go:334] "Generic (PLEG): container finished" podID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" containerID="c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02" exitCode=0 Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.572422 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ppxt9" event={"ID":"d40b6e37-c50a-4419-8eb0-bcebde91bcb8","Type":"ContainerDied","Data":"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02"} Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.572474 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ppxt9" event={"ID":"d40b6e37-c50a-4419-8eb0-bcebde91bcb8","Type":"ContainerDied","Data":"92c5f75152116193bcc58f6855b1c2a40bf1246d62b245ea1b64e14b7b3797de"} Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.572489 4783 scope.go:117] "RemoveContainer" containerID="c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.572469 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ppxt9" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.574106 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2jkn" event={"ID":"474ac824-d8f5-4d2b-9b6f-c385808d57b8","Type":"ContainerStarted","Data":"322525a43f24c5b5a90a676821514866e4e9e0a2bdefe1527b33f62815cd0a08"} Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.590131 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z2jkn" podStartSLOduration=2.095387097 podStartE2EDuration="2.590120206s" podCreationTimestamp="2026-01-31 09:15:13 +0000 UTC" firstStartedPulling="2026-01-31 09:15:14.408322279 +0000 UTC m=+625.077005747" lastFinishedPulling="2026-01-31 09:15:14.903055388 +0000 UTC m=+625.571738856" observedRunningTime="2026-01-31 09:15:15.585685423 +0000 UTC m=+626.254368891" watchObservedRunningTime="2026-01-31 09:15:15.590120206 +0000 UTC m=+626.258803674" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.590528 4783 scope.go:117] "RemoveContainer" containerID="c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02" Jan 31 09:15:15 crc kubenswrapper[4783]: E0131 09:15:15.590903 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02\": container with ID starting with c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02 not found: ID does not exist" containerID="c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.590938 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02"} err="failed to get container status 
\"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02\": rpc error: code = NotFound desc = could not find container \"c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02\": container with ID starting with c73753d345489b28ca171e41b2780e610925d39b6602d72890ff720fc1a6bf02 not found: ID does not exist" Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.599787 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.602567 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ppxt9"] Jan 31 09:15:15 crc kubenswrapper[4783]: I0131 09:15:15.653203 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" path="/var/lib/kubelet/pods/d40b6e37-c50a-4419-8eb0-bcebde91bcb8/volumes" Jan 31 09:15:16 crc kubenswrapper[4783]: I0131 09:15:16.156020 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-2q4pc" Jan 31 09:15:16 crc kubenswrapper[4783]: I0131 09:15:16.830625 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-qcxmw" Jan 31 09:15:24 crc kubenswrapper[4783]: I0131 09:15:24.023625 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:24 crc kubenswrapper[4783]: I0131 09:15:24.025077 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:24 crc kubenswrapper[4783]: I0131 09:15:24.052024 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:24 crc kubenswrapper[4783]: I0131 09:15:24.651229 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-z2jkn" Jan 31 09:15:26 crc kubenswrapper[4783]: I0131 09:15:26.152111 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x77mn" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.151981 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv"] Jan 31 09:15:30 crc kubenswrapper[4783]: E0131 09:15:30.152756 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" containerName="registry-server" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.152771 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" containerName="registry-server" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.152873 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40b6e37-c50a-4419-8eb0-bcebde91bcb8" containerName="registry-server" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.153633 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.155696 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xlkjm" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.163664 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv"] Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.319502 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcfw\" (UniqueName: \"kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.319561 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.319605 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 
09:15:30.420649 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.420721 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.420776 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcfw\" (UniqueName: \"kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.421213 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.421285 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.438977 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcfw\" (UniqueName: \"kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.468610 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:30 crc kubenswrapper[4783]: I0131 09:15:30.828524 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv"] Jan 31 09:15:30 crc kubenswrapper[4783]: W0131 09:15:30.833221 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297a5b92_55db_4a84_8bd3_878ea32367df.slice/crio-2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5 WatchSource:0}: Error finding container 2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5: Status 404 returned error can't find the container with id 2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5 Jan 31 09:15:31 crc kubenswrapper[4783]: I0131 09:15:31.660154 4783 generic.go:334] "Generic (PLEG): container finished" podID="297a5b92-55db-4a84-8bd3-878ea32367df" containerID="eb0b3b18b1dd6ef3e367302cbda531d7cb00eeaeecce4ed7fce18e5220e32e48" exitCode=0 Jan 31 
09:15:31 crc kubenswrapper[4783]: I0131 09:15:31.660210 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" event={"ID":"297a5b92-55db-4a84-8bd3-878ea32367df","Type":"ContainerDied","Data":"eb0b3b18b1dd6ef3e367302cbda531d7cb00eeaeecce4ed7fce18e5220e32e48"} Jan 31 09:15:31 crc kubenswrapper[4783]: I0131 09:15:31.660229 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" event={"ID":"297a5b92-55db-4a84-8bd3-878ea32367df","Type":"ContainerStarted","Data":"2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5"} Jan 31 09:15:33 crc kubenswrapper[4783]: I0131 09:15:33.671505 4783 generic.go:334] "Generic (PLEG): container finished" podID="297a5b92-55db-4a84-8bd3-878ea32367df" containerID="474114dbc78bcf7c5afb36181795aec1d06860e44e1d01e34a4057b62178a7ba" exitCode=0 Jan 31 09:15:33 crc kubenswrapper[4783]: I0131 09:15:33.671537 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" event={"ID":"297a5b92-55db-4a84-8bd3-878ea32367df","Type":"ContainerDied","Data":"474114dbc78bcf7c5afb36181795aec1d06860e44e1d01e34a4057b62178a7ba"} Jan 31 09:15:34 crc kubenswrapper[4783]: I0131 09:15:34.678597 4783 generic.go:334] "Generic (PLEG): container finished" podID="297a5b92-55db-4a84-8bd3-878ea32367df" containerID="50f8ddec8dcf9312494cbc9e9aea542653e932471cf4195ab5f9c903f3cc7e7f" exitCode=0 Jan 31 09:15:34 crc kubenswrapper[4783]: I0131 09:15:34.678647 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" event={"ID":"297a5b92-55db-4a84-8bd3-878ea32367df","Type":"ContainerDied","Data":"50f8ddec8dcf9312494cbc9e9aea542653e932471cf4195ab5f9c903f3cc7e7f"} Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.879189 
4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.983337 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util\") pod \"297a5b92-55db-4a84-8bd3-878ea32367df\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.983422 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle\") pod \"297a5b92-55db-4a84-8bd3-878ea32367df\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.983663 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcfw\" (UniqueName: \"kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw\") pod \"297a5b92-55db-4a84-8bd3-878ea32367df\" (UID: \"297a5b92-55db-4a84-8bd3-878ea32367df\") " Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.985798 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle" (OuterVolumeSpecName: "bundle") pod "297a5b92-55db-4a84-8bd3-878ea32367df" (UID: "297a5b92-55db-4a84-8bd3-878ea32367df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.995009 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util" (OuterVolumeSpecName: "util") pod "297a5b92-55db-4a84-8bd3-878ea32367df" (UID: "297a5b92-55db-4a84-8bd3-878ea32367df"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:35 crc kubenswrapper[4783]: I0131 09:15:35.995817 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw" (OuterVolumeSpecName: "kube-api-access-jvcfw") pod "297a5b92-55db-4a84-8bd3-878ea32367df" (UID: "297a5b92-55db-4a84-8bd3-878ea32367df"). InnerVolumeSpecName "kube-api-access-jvcfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.087545 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvcfw\" (UniqueName: \"kubernetes.io/projected/297a5b92-55db-4a84-8bd3-878ea32367df-kube-api-access-jvcfw\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.087572 4783 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.087583 4783 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/297a5b92-55db-4a84-8bd3-878ea32367df-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.689441 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" event={"ID":"297a5b92-55db-4a84-8bd3-878ea32367df","Type":"ContainerDied","Data":"2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5"} Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.689626 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4dd22c8e6aed60485648537530be8c22795a1e726f6ef1d1eca9b35d3eced5" Jan 31 09:15:36 crc kubenswrapper[4783]: I0131 09:15:36.689504 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.215304 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf"] Jan 31 09:15:42 crc kubenswrapper[4783]: E0131 09:15:42.215666 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="extract" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.215819 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="extract" Jan 31 09:15:42 crc kubenswrapper[4783]: E0131 09:15:42.215834 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="util" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.215839 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="util" Jan 31 09:15:42 crc kubenswrapper[4783]: E0131 09:15:42.215849 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="pull" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.215855 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="pull" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.215949 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="297a5b92-55db-4a84-8bd3-878ea32367df" containerName="extract" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.216378 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.217647 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-xwdk5" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.233382 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf"] Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.356018 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc95h\" (UniqueName: \"kubernetes.io/projected/5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab-kube-api-access-jc95h\") pod \"openstack-operator-controller-init-757f46c65d-s5jcf\" (UID: \"5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.456847 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc95h\" (UniqueName: \"kubernetes.io/projected/5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab-kube-api-access-jc95h\") pod \"openstack-operator-controller-init-757f46c65d-s5jcf\" (UID: \"5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.472427 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc95h\" (UniqueName: \"kubernetes.io/projected/5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab-kube-api-access-jc95h\") pod \"openstack-operator-controller-init-757f46c65d-s5jcf\" (UID: \"5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.528819 4783 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:42 crc kubenswrapper[4783]: I0131 09:15:42.878656 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf"] Jan 31 09:15:43 crc kubenswrapper[4783]: I0131 09:15:43.721020 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" event={"ID":"5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab","Type":"ContainerStarted","Data":"73d6465ec7a14f74ac656dd597e0e218b58f3e9e7fc06c1925d08cf419022714"} Jan 31 09:15:47 crc kubenswrapper[4783]: I0131 09:15:47.746805 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" event={"ID":"5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab","Type":"ContainerStarted","Data":"d6e2761f5b2c8403252fd90e2bee72764bffe24bf5128636c6f1851cee4017ca"} Jan 31 09:15:47 crc kubenswrapper[4783]: I0131 09:15:47.747322 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:15:47 crc kubenswrapper[4783]: I0131 09:15:47.768253 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" podStartSLOduration=1.58164813 podStartE2EDuration="5.7682378s" podCreationTimestamp="2026-01-31 09:15:42 +0000 UTC" firstStartedPulling="2026-01-31 09:15:42.884794282 +0000 UTC m=+653.553477750" lastFinishedPulling="2026-01-31 09:15:47.071383961 +0000 UTC m=+657.740067420" observedRunningTime="2026-01-31 09:15:47.765463132 +0000 UTC m=+658.434146600" watchObservedRunningTime="2026-01-31 09:15:47.7682378 +0000 UTC m=+658.436921268" Jan 31 09:15:52 crc kubenswrapper[4783]: I0131 09:15:52.530806 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-757f46c65d-s5jcf" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.785003 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.792980 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.813586 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9l758" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.843244 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.849318 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.851369 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.854590 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xzt2h" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.859844 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.860760 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.862908 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qngs7" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.863708 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.878003 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.897236 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.898182 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.899505 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-v7q29" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.909219 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.918136 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.919085 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.920415 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.921359 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.924993 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gbx9c" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.925136 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cccnr" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.929569 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.951026 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-rn96f"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.951824 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.955688 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.956111 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-k6lt8" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.956335 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.956691 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.958632 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-rn96f"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.958919 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-n96s4" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.959503 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ffc\" (UniqueName: \"kubernetes.io/projected/99914340-4708-4322-996f-7392f6fe6e02-kube-api-access-82ffc\") pod \"cinder-operator-controller-manager-8d874c8fc-vjgpf\" (UID: \"99914340-4708-4322-996f-7392f6fe6e02\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.959551 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z6fp\" (UniqueName: 
\"kubernetes.io/projected/7ddb9dd0-fc57-4685-a7d5-778a4152ea58-kube-api-access-6z6fp\") pod \"designate-operator-controller-manager-6d9697b7f4-k9948\" (UID: \"7ddb9dd0-fc57-4685-a7d5-778a4152ea58\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.959618 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbm7w\" (UniqueName: \"kubernetes.io/projected/3a079322-76ea-4cb9-a8b6-3f0b1a360086-kube-api-access-tbm7w\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-gwkhk\" (UID: \"3a079322-76ea-4cb9-a8b6-3f0b1a360086\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.963681 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.971582 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg"] Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.972141 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.978261 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qhmbh" Jan 31 09:16:11 crc kubenswrapper[4783]: I0131 09:16:11.991867 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:11.999570 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.006865 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.007804 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.009750 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-n55xx" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.020468 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.029147 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.030062 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.033505 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nzwvf" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.042313 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.044689 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.045480 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.053236 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jc8zq" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.058154 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.059004 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.060524 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dbpms" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.061795 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82ffc\" (UniqueName: \"kubernetes.io/projected/99914340-4708-4322-996f-7392f6fe6e02-kube-api-access-82ffc\") pod \"cinder-operator-controller-manager-8d874c8fc-vjgpf\" (UID: \"99914340-4708-4322-996f-7392f6fe6e02\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.061890 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z6fp\" (UniqueName: \"kubernetes.io/projected/7ddb9dd0-fc57-4685-a7d5-778a4152ea58-kube-api-access-6z6fp\") pod \"designate-operator-controller-manager-6d9697b7f4-k9948\" (UID: \"7ddb9dd0-fc57-4685-a7d5-778a4152ea58\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.061987 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sx7w\" (UniqueName: \"kubernetes.io/projected/1de23cab-104f-49ac-ab9f-3b1d08733ff9-kube-api-access-4sx7w\") pod \"horizon-operator-controller-manager-5fb775575f-4nwdm\" (UID: \"1de23cab-104f-49ac-ab9f-3b1d08733ff9\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.062072 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfbh\" (UniqueName: \"kubernetes.io/projected/5033c800-ef69-4228-a204-b66401c4725c-kube-api-access-mgfbh\") pod 
\"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.062205 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mwt\" (UniqueName: \"kubernetes.io/projected/bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352-kube-api-access-94mwt\") pod \"ironic-operator-controller-manager-5f4b8bd54d-82z66\" (UID: \"bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.062292 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbm7w\" (UniqueName: \"kubernetes.io/projected/3a079322-76ea-4cb9-a8b6-3f0b1a360086-kube-api-access-tbm7w\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-gwkhk\" (UID: \"3a079322-76ea-4cb9-a8b6-3f0b1a360086\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.062861 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjwh\" (UniqueName: \"kubernetes.io/projected/1e1e07c3-0aeb-47fd-be71-a13716a04f29-kube-api-access-wgjwh\") pod \"heat-operator-controller-manager-69d6db494d-ln9h6\" (UID: \"1e1e07c3-0aeb-47fd-be71-a13716a04f29\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.062941 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76v8\" (UniqueName: \"kubernetes.io/projected/e4606683-7a0b-4a0f-ae81-3c6e598a36e6-kube-api-access-g76v8\") pod \"glance-operator-controller-manager-8886f4c47-btfdw\" (UID: \"e4606683-7a0b-4a0f-ae81-3c6e598a36e6\") " 
pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.063007 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.064026 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.081660 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.084741 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbm7w\" (UniqueName: \"kubernetes.io/projected/3a079322-76ea-4cb9-a8b6-3f0b1a360086-kube-api-access-tbm7w\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-gwkhk\" (UID: \"3a079322-76ea-4cb9-a8b6-3f0b1a360086\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.090070 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.090884 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.091994 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z6fp\" (UniqueName: \"kubernetes.io/projected/7ddb9dd0-fc57-4685-a7d5-778a4152ea58-kube-api-access-6z6fp\") pod \"designate-operator-controller-manager-6d9697b7f4-k9948\" (UID: \"7ddb9dd0-fc57-4685-a7d5-778a4152ea58\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.092081 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ffc\" (UniqueName: \"kubernetes.io/projected/99914340-4708-4322-996f-7392f6fe6e02-kube-api-access-82ffc\") pod \"cinder-operator-controller-manager-8d874c8fc-vjgpf\" (UID: \"99914340-4708-4322-996f-7392f6fe6e02\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.113809 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wspmw" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.113944 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.132711 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.165222 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.166140 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168014 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mwt\" (UniqueName: \"kubernetes.io/projected/bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352-kube-api-access-94mwt\") pod \"ironic-operator-controller-manager-5f4b8bd54d-82z66\" (UID: \"bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168058 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dx96\" (UniqueName: \"kubernetes.io/projected/8178283d-c10c-45e6-a465-bdb5096d8904-kube-api-access-4dx96\") pod \"nova-operator-controller-manager-55bff696bd-tmf9x\" (UID: \"8178283d-c10c-45e6-a465-bdb5096d8904\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168089 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjwh\" (UniqueName: \"kubernetes.io/projected/1e1e07c3-0aeb-47fd-be71-a13716a04f29-kube-api-access-wgjwh\") pod \"heat-operator-controller-manager-69d6db494d-ln9h6\" (UID: \"1e1e07c3-0aeb-47fd-be71-a13716a04f29\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168108 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76v8\" (UniqueName: \"kubernetes.io/projected/e4606683-7a0b-4a0f-ae81-3c6e598a36e6-kube-api-access-g76v8\") pod \"glance-operator-controller-manager-8886f4c47-btfdw\" (UID: \"e4606683-7a0b-4a0f-ae81-3c6e598a36e6\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:12 crc 
kubenswrapper[4783]: I0131 09:16:12.168127 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168150 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f769\" (UniqueName: \"kubernetes.io/projected/8fdee142-92ef-49d1-bac6-6f6c3873b2cb-kube-api-access-6f769\") pod \"manila-operator-controller-manager-7dd968899f-97h8r\" (UID: \"8fdee142-92ef-49d1-bac6-6f6c3873b2cb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168212 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sx7w\" (UniqueName: \"kubernetes.io/projected/1de23cab-104f-49ac-ab9f-3b1d08733ff9-kube-api-access-4sx7w\") pod \"horizon-operator-controller-manager-5fb775575f-4nwdm\" (UID: \"1de23cab-104f-49ac-ab9f-3b1d08733ff9\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168232 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5lq\" (UniqueName: \"kubernetes.io/projected/cd28073a-c4f4-4b1d-9680-a9d5a5939deb-kube-api-access-vr5lq\") pod \"keystone-operator-controller-manager-84f48565d4-t7nxg\" (UID: \"cd28073a-c4f4-4b1d-9680-a9d5a5939deb\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168265 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfbh\" 
(UniqueName: \"kubernetes.io/projected/5033c800-ef69-4228-a204-b66401c4725c-kube-api-access-mgfbh\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168299 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjcm5\" (UniqueName: \"kubernetes.io/projected/acb9dbfe-e754-4021-bc54-7ccd17b217a4-kube-api-access-kjcm5\") pod \"mariadb-operator-controller-manager-67bf948998-bxv5z\" (UID: \"acb9dbfe-e754-4021-bc54-7ccd17b217a4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.168315 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4z8\" (UniqueName: \"kubernetes.io/projected/be9f1345-8ca5-49da-a52e-4b841ea07ac3-kube-api-access-4l4z8\") pod \"neutron-operator-controller-manager-585dbc889-fvml6\" (UID: \"be9f1345-8ca5-49da-a52e-4b841ea07ac3\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.168837 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.168879 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:12.66886401 +0000 UTC m=+683.337547479 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.169422 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.169467 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4mndg" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.169820 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.176773 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.177676 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.188693 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.189725 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.190312 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.193036 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8sbcn" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.205419 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7wrvq" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.205621 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.220721 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76v8\" (UniqueName: \"kubernetes.io/projected/e4606683-7a0b-4a0f-ae81-3c6e598a36e6-kube-api-access-g76v8\") pod \"glance-operator-controller-manager-8886f4c47-btfdw\" (UID: \"e4606683-7a0b-4a0f-ae81-3c6e598a36e6\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.221208 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mwt\" (UniqueName: \"kubernetes.io/projected/bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352-kube-api-access-94mwt\") pod \"ironic-operator-controller-manager-5f4b8bd54d-82z66\" (UID: \"bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.223124 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjwh\" (UniqueName: \"kubernetes.io/projected/1e1e07c3-0aeb-47fd-be71-a13716a04f29-kube-api-access-wgjwh\") pod \"heat-operator-controller-manager-69d6db494d-ln9h6\" (UID: \"1e1e07c3-0aeb-47fd-be71-a13716a04f29\") " 
pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.227683 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfbh\" (UniqueName: \"kubernetes.io/projected/5033c800-ef69-4228-a204-b66401c4725c-kube-api-access-mgfbh\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.227748 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.234763 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sx7w\" (UniqueName: \"kubernetes.io/projected/1de23cab-104f-49ac-ab9f-3b1d08733ff9-kube-api-access-4sx7w\") pod \"horizon-operator-controller-manager-5fb775575f-4nwdm\" (UID: \"1de23cab-104f-49ac-ab9f-3b1d08733ff9\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.239581 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.248326 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.261131 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.275042 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.275214 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.277966 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.278413 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.279965 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280605 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dx96\" (UniqueName: \"kubernetes.io/projected/8178283d-c10c-45e6-a465-bdb5096d8904-kube-api-access-4dx96\") pod \"nova-operator-controller-manager-55bff696bd-tmf9x\" (UID: \"8178283d-c10c-45e6-a465-bdb5096d8904\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280675 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nzqxw\" (UniqueName: \"kubernetes.io/projected/fa110500-5ebd-4645-86a1-e3bf4b9780fe-kube-api-access-nzqxw\") pod \"octavia-operator-controller-manager-6687f8d877-s842k\" (UID: \"fa110500-5ebd-4645-86a1-e3bf4b9780fe\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280707 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f769\" (UniqueName: \"kubernetes.io/projected/8fdee142-92ef-49d1-bac6-6f6c3873b2cb-kube-api-access-6f769\") pod \"manila-operator-controller-manager-7dd968899f-97h8r\" (UID: \"8fdee142-92ef-49d1-bac6-6f6c3873b2cb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280732 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6ct\" (UniqueName: \"kubernetes.io/projected/caf4a0bd-2f55-4185-a756-4a640cbfe8d3-kube-api-access-cp6ct\") pod \"ovn-operator-controller-manager-788c46999f-x86p2\" (UID: \"caf4a0bd-2f55-4185-a756-4a640cbfe8d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280775 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlq7w\" (UniqueName: \"kubernetes.io/projected/10649b85-8b3a-44e2-9477-6f5821d232a7-kube-api-access-vlq7w\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280820 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5lq\" (UniqueName: 
\"kubernetes.io/projected/cd28073a-c4f4-4b1d-9680-a9d5a5939deb-kube-api-access-vr5lq\") pod \"keystone-operator-controller-manager-84f48565d4-t7nxg\" (UID: \"cd28073a-c4f4-4b1d-9680-a9d5a5939deb\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280932 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjcm5\" (UniqueName: \"kubernetes.io/projected/acb9dbfe-e754-4021-bc54-7ccd17b217a4-kube-api-access-kjcm5\") pod \"mariadb-operator-controller-manager-67bf948998-bxv5z\" (UID: \"acb9dbfe-e754-4021-bc54-7ccd17b217a4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.280965 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4z8\" (UniqueName: \"kubernetes.io/projected/be9f1345-8ca5-49da-a52e-4b841ea07ac3-kube-api-access-4l4z8\") pod \"neutron-operator-controller-manager-585dbc889-fvml6\" (UID: \"be9f1345-8ca5-49da-a52e-4b841ea07ac3\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.281479 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n4fwb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.284139 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.317886 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5lq\" (UniqueName: \"kubernetes.io/projected/cd28073a-c4f4-4b1d-9680-a9d5a5939deb-kube-api-access-vr5lq\") pod \"keystone-operator-controller-manager-84f48565d4-t7nxg\" (UID: \"cd28073a-c4f4-4b1d-9680-a9d5a5939deb\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.325687 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4z8\" (UniqueName: \"kubernetes.io/projected/be9f1345-8ca5-49da-a52e-4b841ea07ac3-kube-api-access-4l4z8\") pod \"neutron-operator-controller-manager-585dbc889-fvml6\" (UID: \"be9f1345-8ca5-49da-a52e-4b841ea07ac3\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.326655 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dx96\" (UniqueName: \"kubernetes.io/projected/8178283d-c10c-45e6-a465-bdb5096d8904-kube-api-access-4dx96\") pod \"nova-operator-controller-manager-55bff696bd-tmf9x\" (UID: \"8178283d-c10c-45e6-a465-bdb5096d8904\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.355505 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjcm5\" (UniqueName: \"kubernetes.io/projected/acb9dbfe-e754-4021-bc54-7ccd17b217a4-kube-api-access-kjcm5\") pod \"mariadb-operator-controller-manager-67bf948998-bxv5z\" (UID: \"acb9dbfe-e754-4021-bc54-7ccd17b217a4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.357242 4783 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.358106 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.359151 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f769\" (UniqueName: \"kubernetes.io/projected/8fdee142-92ef-49d1-bac6-6f6c3873b2cb-kube-api-access-6f769\") pod \"manila-operator-controller-manager-7dd968899f-97h8r\" (UID: \"8fdee142-92ef-49d1-bac6-6f6c3873b2cb\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.360003 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.361874 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-n46bb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.364679 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.371978 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.375945 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.376552 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.379856 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.380130 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.380718 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hq6dm" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381485 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6ct\" (UniqueName: \"kubernetes.io/projected/caf4a0bd-2f55-4185-a756-4a640cbfe8d3-kube-api-access-cp6ct\") pod \"ovn-operator-controller-manager-788c46999f-x86p2\" (UID: \"caf4a0bd-2f55-4185-a756-4a640cbfe8d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381583 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4l8j\" (UniqueName: \"kubernetes.io/projected/c42112c2-917b-491b-9a4a-5253a0fc8d09-kube-api-access-v4l8j\") pod \"swift-operator-controller-manager-68fc8c869-5hw2c\" (UID: \"c42112c2-917b-491b-9a4a-5253a0fc8d09\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381672 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlq7w\" (UniqueName: \"kubernetes.io/projected/10649b85-8b3a-44e2-9477-6f5821d232a7-kube-api-access-vlq7w\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: 
\"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381746 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtnw\" (UniqueName: \"kubernetes.io/projected/8ad3860f-1b60-4522-8026-08212156646d-kube-api-access-hbtnw\") pod \"telemetry-operator-controller-manager-64b5b76f97-8dwmb\" (UID: \"8ad3860f-1b60-4522-8026-08212156646d\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381821 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9p5h\" (UniqueName: \"kubernetes.io/projected/d6255583-1dd4-4901-b3af-8619aa03434b-kube-api-access-b9p5h\") pod \"test-operator-controller-manager-56f8bfcd9f-9d95x\" (UID: \"d6255583-1dd4-4901-b3af-8619aa03434b\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381906 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57jjj\" (UniqueName: \"kubernetes.io/projected/5bc18646-6e2c-41b5-8690-b6b7eda1a8cc-kube-api-access-57jjj\") pod \"placement-operator-controller-manager-5b964cf4cd-x6rmh\" (UID: \"5bc18646-6e2c-41b5-8690-b6b7eda1a8cc\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.381979 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.382065 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqxw\" (UniqueName: \"kubernetes.io/projected/fa110500-5ebd-4645-86a1-e3bf4b9780fe-kube-api-access-nzqxw\") pod \"octavia-operator-controller-manager-6687f8d877-s842k\" (UID: \"fa110500-5ebd-4645-86a1-e3bf4b9780fe\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.385028 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.385074 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:12.88505844 +0000 UTC m=+683.553741897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.407493 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlq7w\" (UniqueName: \"kubernetes.io/projected/10649b85-8b3a-44e2-9477-6f5821d232a7-kube-api-access-vlq7w\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.410641 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6ct\" (UniqueName: \"kubernetes.io/projected/caf4a0bd-2f55-4185-a756-4a640cbfe8d3-kube-api-access-cp6ct\") pod \"ovn-operator-controller-manager-788c46999f-x86p2\" (UID: \"caf4a0bd-2f55-4185-a756-4a640cbfe8d3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.411154 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqxw\" (UniqueName: \"kubernetes.io/projected/fa110500-5ebd-4645-86a1-e3bf4b9780fe-kube-api-access-nzqxw\") pod \"octavia-operator-controller-manager-6687f8d877-s842k\" (UID: \"fa110500-5ebd-4645-86a1-e3bf4b9780fe\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.417537 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-sbgw8"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.418472 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.421188 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4zxxt" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.430574 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-sbgw8"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.441524 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.482664 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57jjj\" (UniqueName: \"kubernetes.io/projected/5bc18646-6e2c-41b5-8690-b6b7eda1a8cc-kube-api-access-57jjj\") pod \"placement-operator-controller-manager-5b964cf4cd-x6rmh\" (UID: \"5bc18646-6e2c-41b5-8690-b6b7eda1a8cc\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.482747 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4l8j\" (UniqueName: \"kubernetes.io/projected/c42112c2-917b-491b-9a4a-5253a0fc8d09-kube-api-access-v4l8j\") pod \"swift-operator-controller-manager-68fc8c869-5hw2c\" (UID: \"c42112c2-917b-491b-9a4a-5253a0fc8d09\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.482784 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtnw\" (UniqueName: \"kubernetes.io/projected/8ad3860f-1b60-4522-8026-08212156646d-kube-api-access-hbtnw\") pod \"telemetry-operator-controller-manager-64b5b76f97-8dwmb\" (UID: 
\"8ad3860f-1b60-4522-8026-08212156646d\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.482805 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9p5h\" (UniqueName: \"kubernetes.io/projected/d6255583-1dd4-4901-b3af-8619aa03434b-kube-api-access-b9p5h\") pod \"test-operator-controller-manager-56f8bfcd9f-9d95x\" (UID: \"d6255583-1dd4-4901-b3af-8619aa03434b\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.482838 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9fn8\" (UniqueName: \"kubernetes.io/projected/b4b9ea20-ea14-4c87-b40e-5767debc9f57-kube-api-access-b9fn8\") pod \"watcher-operator-controller-manager-564965969-sbgw8\" (UID: \"b4b9ea20-ea14-4c87-b40e-5767debc9f57\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.500664 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57jjj\" (UniqueName: \"kubernetes.io/projected/5bc18646-6e2c-41b5-8690-b6b7eda1a8cc-kube-api-access-57jjj\") pod \"placement-operator-controller-manager-5b964cf4cd-x6rmh\" (UID: \"5bc18646-6e2c-41b5-8690-b6b7eda1a8cc\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.501007 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4l8j\" (UniqueName: \"kubernetes.io/projected/c42112c2-917b-491b-9a4a-5253a0fc8d09-kube-api-access-v4l8j\") pod \"swift-operator-controller-manager-68fc8c869-5hw2c\" (UID: \"c42112c2-917b-491b-9a4a-5253a0fc8d09\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:12 crc 
kubenswrapper[4783]: I0131 09:16:12.503261 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9p5h\" (UniqueName: \"kubernetes.io/projected/d6255583-1dd4-4901-b3af-8619aa03434b-kube-api-access-b9p5h\") pod \"test-operator-controller-manager-56f8bfcd9f-9d95x\" (UID: \"d6255583-1dd4-4901-b3af-8619aa03434b\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.516782 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtnw\" (UniqueName: \"kubernetes.io/projected/8ad3860f-1b60-4522-8026-08212156646d-kube-api-access-hbtnw\") pod \"telemetry-operator-controller-manager-64b5b76f97-8dwmb\" (UID: \"8ad3860f-1b60-4522-8026-08212156646d\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.518105 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.520977 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.543441 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.543600 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.543766 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bhdcr" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.555787 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.569746 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.580156 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.583430 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9fn8\" (UniqueName: \"kubernetes.io/projected/b4b9ea20-ea14-4c87-b40e-5767debc9f57-kube-api-access-b9fn8\") pod \"watcher-operator-controller-manager-564965969-sbgw8\" (UID: \"b4b9ea20-ea14-4c87-b40e-5767debc9f57\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.583465 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hww6k\" (UniqueName: \"kubernetes.io/projected/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-kube-api-access-hww6k\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.583506 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.583548 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc 
kubenswrapper[4783]: I0131 09:16:12.595320 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.609149 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9fn8\" (UniqueName: \"kubernetes.io/projected/b4b9ea20-ea14-4c87-b40e-5767debc9f57-kube-api-access-b9fn8\") pod \"watcher-operator-controller-manager-564965969-sbgw8\" (UID: \"b4b9ea20-ea14-4c87-b40e-5767debc9f57\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.620358 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.633830 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.634209 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.634822 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.647926 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-k9cz7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.669791 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.674196 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.684854 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.684916 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.684965 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " 
pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.685013 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hww6k\" (UniqueName: \"kubernetes.io/projected/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-kube-api-access-hww6k\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685146 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685248 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:13.185221935 +0000 UTC m=+683.853905403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685381 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685437 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:13.68542118 +0000 UTC m=+684.354104648 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685630 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.685809 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:13.185780778 +0000 UTC m=+683.854464245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.692126 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.699829 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hww6k\" (UniqueName: \"kubernetes.io/projected/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-kube-api-access-hww6k\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.766310 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.793991 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzqb\" (UniqueName: \"kubernetes.io/projected/2d0fc101-8afd-4154-9741-d5d3520990fe-kube-api-access-mxzqb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pfgj4\" (UID: \"2d0fc101-8afd-4154-9741-d5d3520990fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.849460 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.863523 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948"] Jan 31 09:16:12 crc kubenswrapper[4783]: W0131 09:16:12.872809 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ddb9dd0_fc57_4685_a7d5_778a4152ea58.slice/crio-9d965e17639ac1141ab62dd580be84c0bed64f9e2d8606bf2c140b458f8c8f0a WatchSource:0}: Error finding container 9d965e17639ac1141ab62dd580be84c0bed64f9e2d8606bf2c140b458f8c8f0a: Status 404 returned error can't find the container with id 9d965e17639ac1141ab62dd580be84c0bed64f9e2d8606bf2c140b458f8c8f0a Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.874535 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf"] Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.901336 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxzqb\" (UniqueName: \"kubernetes.io/projected/2d0fc101-8afd-4154-9741-d5d3520990fe-kube-api-access-mxzqb\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-pfgj4\" (UID: \"2d0fc101-8afd-4154-9741-d5d3520990fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.901439 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.901598 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: E0131 09:16:12.901645 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:13.901629113 +0000 UTC m=+684.570312582 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:12 crc kubenswrapper[4783]: I0131 09:16:12.922610 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzqb\" (UniqueName: \"kubernetes.io/projected/2d0fc101-8afd-4154-9741-d5d3520990fe-kube-api-access-mxzqb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pfgj4\" (UID: \"2d0fc101-8afd-4154-9741-d5d3520990fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" Jan 31 09:16:12 crc kubenswrapper[4783]: W0131 09:16:12.931588 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99914340_4708_4322_996f_7392f6fe6e02.slice/crio-207d65fd0c6b67fff372d7f3a81e9c06b0629df92485edb72634553f3a13af81 WatchSource:0}: Error finding container 207d65fd0c6b67fff372d7f3a81e9c06b0629df92485edb72634553f3a13af81: Status 404 returned error can't find the container with id 207d65fd0c6b67fff372d7f3a81e9c06b0629df92485edb72634553f3a13af81 Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.021245 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.058076 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm"] Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.059108 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de23cab_104f_49ac_ab9f_3b1d08733ff9.slice/crio-36d216224fc5c00f84ab014b7d9abb530dd98dee280706de1fb120dfe62292f2 WatchSource:0}: Error finding container 36d216224fc5c00f84ab014b7d9abb530dd98dee280706de1fb120dfe62292f2: Status 404 returned error can't find the container with id 36d216224fc5c00f84ab014b7d9abb530dd98dee280706de1fb120dfe62292f2 Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.068592 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw"] Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.071713 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4606683_7a0b_4a0f_ae81_3c6e598a36e6.slice/crio-229f078018c3c95f6a5a1e6ed864cbc5755cd31f6ff1a01a427d65e2b369d99c WatchSource:0}: Error finding container 229f078018c3c95f6a5a1e6ed864cbc5755cd31f6ff1a01a427d65e2b369d99c: Status 404 returned error can't find the container with id 229f078018c3c95f6a5a1e6ed864cbc5755cd31f6ff1a01a427d65e2b369d99c Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.185405 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.202272 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6"] Jan 31 09:16:13 crc 
kubenswrapper[4783]: I0131 09:16:13.207888 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.208074 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.208093 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.208154 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:14.20811452 +0000 UTC m=+684.876797988 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.208353 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.208450 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:14.208422371 +0000 UTC m=+684.877105839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.214630 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66"] Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.220441 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbcb9fbf_dab0_4029_b8a8_9e6f13bdf352.slice/crio-ee81fa19992073087c1bc0d426406d7292adeec9f351d390748094ff8b90a466 WatchSource:0}: Error finding container ee81fa19992073087c1bc0d426406d7292adeec9f351d390748094ff8b90a466: Status 404 returned error can't find the container with id ee81fa19992073087c1bc0d426406d7292adeec9f351d390748094ff8b90a466 Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.305373 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.314983 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.319347 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.388988 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.395812 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.399566 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.404147 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-sbgw8"] Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.409672 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x"] Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.409790 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9p5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-9d95x_openstack-operators(d6255583-1dd4-4901-b3af-8619aa03434b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.410899 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" podUID="d6255583-1dd4-4901-b3af-8619aa03434b" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.412140 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hbtnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-8dwmb_openstack-operators(8ad3860f-1b60-4522-8026-08212156646d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.412276 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cp6ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-x86p2_openstack-operators(caf4a0bd-2f55-4185-a756-4a640cbfe8d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.413310 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" podUID="8ad3860f-1b60-4522-8026-08212156646d" Jan 31 09:16:13 crc 
kubenswrapper[4783]: E0131 09:16:13.413373 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" podUID="caf4a0bd-2f55-4185-a756-4a640cbfe8d3" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.416912 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-57jjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-x6rmh_openstack-operators(5bc18646-6e2c-41b5-8690-b6b7eda1a8cc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.418276 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" podUID="5bc18646-6e2c-41b5-8690-b6b7eda1a8cc" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.418432 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4l8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-5hw2c_openstack-operators(c42112c2-917b-491b-9a4a-5253a0fc8d09): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.419496 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c"] Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.419527 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" podUID="c42112c2-917b-491b-9a4a-5253a0fc8d09" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.514861 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r"] Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.516917 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fdee142_92ef_49d1_bac6_6f6c3873b2cb.slice/crio-0811f4ec71ed068af351e7e8156af9f71362cac721b765652467f22aa8b6b9f8 WatchSource:0}: Error finding container 0811f4ec71ed068af351e7e8156af9f71362cac721b765652467f22aa8b6b9f8: Status 404 returned error can't find the container with id 0811f4ec71ed068af351e7e8156af9f71362cac721b765652467f22aa8b6b9f8 Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.549684 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg"] Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.550927 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd28073a_c4f4_4b1d_9680_a9d5a5939deb.slice/crio-e1bebe8a5812ea8377f3f0d7f157ea36d050a1437ba901b9f358023926589214 WatchSource:0}: Error finding container e1bebe8a5812ea8377f3f0d7f157ea36d050a1437ba901b9f358023926589214: Status 404 returned error can't find the container with id e1bebe8a5812ea8377f3f0d7f157ea36d050a1437ba901b9f358023926589214 Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.555157 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4"] Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.555664 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vr5lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-t7nxg_openstack-operators(cd28073a-c4f4-4b1d-9680-a9d5a5939deb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: W0131 09:16:13.555923 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0fc101_8afd_4154_9741_d5d3520990fe.slice/crio-70cbbda3011650d348b63ef7b2199ee73735c522bc476b8f96fbc85f2111c6ca WatchSource:0}: Error finding container 70cbbda3011650d348b63ef7b2199ee73735c522bc476b8f96fbc85f2111c6ca: Status 404 returned error can't find the container with id 70cbbda3011650d348b63ef7b2199ee73735c522bc476b8f96fbc85f2111c6ca Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.557707 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" podUID="cd28073a-c4f4-4b1d-9680-a9d5a5939deb" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.558499 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxzqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pfgj4_openstack-operators(2d0fc101-8afd-4154-9741-d5d3520990fe): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.559795 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" podUID="2d0fc101-8afd-4154-9741-d5d3520990fe" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.717065 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.717353 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.717454 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:15.717425517 +0000 UTC m=+686.386108985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.890322 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" event={"ID":"1e1e07c3-0aeb-47fd-be71-a13716a04f29","Type":"ContainerStarted","Data":"669ce6a2003b691f1b89e71c3d822fe303108736c3fa88243de0f127a9049a2a"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.891979 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" event={"ID":"cd28073a-c4f4-4b1d-9680-a9d5a5939deb","Type":"ContainerStarted","Data":"e1bebe8a5812ea8377f3f0d7f157ea36d050a1437ba901b9f358023926589214"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.893876 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" podUID="cd28073a-c4f4-4b1d-9680-a9d5a5939deb" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.894338 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" event={"ID":"c42112c2-917b-491b-9a4a-5253a0fc8d09","Type":"ContainerStarted","Data":"e37e8bf24dc9f97d67faf35934f7c687691e4f2d2b829a49d71a62261ef8329f"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.896377 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" podUID="c42112c2-917b-491b-9a4a-5253a0fc8d09" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.896499 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" event={"ID":"d6255583-1dd4-4901-b3af-8619aa03434b","Type":"ContainerStarted","Data":"f72d9a3d7b7702ddadc3e1f39d66f39620c449e96c19d41965afc4319d5ae28c"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.897620 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" podUID="d6255583-1dd4-4901-b3af-8619aa03434b" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.898367 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" event={"ID":"7ddb9dd0-fc57-4685-a7d5-778a4152ea58","Type":"ContainerStarted","Data":"9d965e17639ac1141ab62dd580be84c0bed64f9e2d8606bf2c140b458f8c8f0a"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.900193 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" event={"ID":"e4606683-7a0b-4a0f-ae81-3c6e598a36e6","Type":"ContainerStarted","Data":"229f078018c3c95f6a5a1e6ed864cbc5755cd31f6ff1a01a427d65e2b369d99c"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.901538 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" 
event={"ID":"8178283d-c10c-45e6-a465-bdb5096d8904","Type":"ContainerStarted","Data":"f873e9c8976737e6a3ec9939000d2548cd184cf5e379254058b9ae450b7e1590"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.902917 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" event={"ID":"acb9dbfe-e754-4021-bc54-7ccd17b217a4","Type":"ContainerStarted","Data":"a26681d2a0729fbbead805efe3a89e7a2056572fc5cb3a8648532e8f0303664f"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.904578 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" event={"ID":"8fdee142-92ef-49d1-bac6-6f6c3873b2cb","Type":"ContainerStarted","Data":"0811f4ec71ed068af351e7e8156af9f71362cac721b765652467f22aa8b6b9f8"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.910626 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" event={"ID":"8ad3860f-1b60-4522-8026-08212156646d","Type":"ContainerStarted","Data":"5b47988bc26bcbcd26b0d68e739aaee765d704d0dbc47d56bc90e440588ce0c6"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.913364 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" podUID="8ad3860f-1b60-4522-8026-08212156646d" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.913956 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" 
event={"ID":"99914340-4708-4322-996f-7392f6fe6e02","Type":"ContainerStarted","Data":"207d65fd0c6b67fff372d7f3a81e9c06b0629df92485edb72634553f3a13af81"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.916510 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" event={"ID":"bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352","Type":"ContainerStarted","Data":"ee81fa19992073087c1bc0d426406d7292adeec9f351d390748094ff8b90a466"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.919625 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" event={"ID":"5bc18646-6e2c-41b5-8690-b6b7eda1a8cc","Type":"ContainerStarted","Data":"d31fb9a503e8e5c4971b09600d4c512870b905dd0109c78ab93ac02c65c99a89"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.920482 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.920512 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" podUID="5bc18646-6e2c-41b5-8690-b6b7eda1a8cc" Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.920624 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.920666 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" event={"ID":"fa110500-5ebd-4645-86a1-e3bf4b9780fe","Type":"ContainerStarted","Data":"a3cfd989133d16c49c3fa54ee18ba0479881a95eedacbf093b612edb52da619d"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.920680 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:15.920657932 +0000 UTC m=+686.589341400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.923015 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" event={"ID":"b4b9ea20-ea14-4c87-b40e-5767debc9f57","Type":"ContainerStarted","Data":"f425517361edfc409e7494a63292fa6f8876391793f4ba386b23aa54b43a4df5"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.926842 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" event={"ID":"caf4a0bd-2f55-4185-a756-4a640cbfe8d3","Type":"ContainerStarted","Data":"58a8173fa70508942139cf7f22c5de669b428a782fb6d8fc777df1c57bd7ba79"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.927562 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" podUID="caf4a0bd-2f55-4185-a756-4a640cbfe8d3" Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.928382 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" event={"ID":"be9f1345-8ca5-49da-a52e-4b841ea07ac3","Type":"ContainerStarted","Data":"f62bbac09602eca442fb95ac49cbcdd1e4927ce830ea69f1691b3b06f008611a"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.930126 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" event={"ID":"1de23cab-104f-49ac-ab9f-3b1d08733ff9","Type":"ContainerStarted","Data":"36d216224fc5c00f84ab014b7d9abb530dd98dee280706de1fb120dfe62292f2"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.933310 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" event={"ID":"2d0fc101-8afd-4154-9741-d5d3520990fe","Type":"ContainerStarted","Data":"70cbbda3011650d348b63ef7b2199ee73735c522bc476b8f96fbc85f2111c6ca"} Jan 31 09:16:13 crc kubenswrapper[4783]: I0131 09:16:13.939030 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" event={"ID":"3a079322-76ea-4cb9-a8b6-3f0b1a360086","Type":"ContainerStarted","Data":"9136b4a4be560c2e0df9b3bb2f14a3389f8b0ec081311c61b9a68692401ee488"} Jan 31 09:16:13 crc kubenswrapper[4783]: E0131 09:16:13.940479 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" podUID="2d0fc101-8afd-4154-9741-d5d3520990fe" Jan 31 09:16:14 crc kubenswrapper[4783]: I0131 09:16:14.223835 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:14 crc kubenswrapper[4783]: I0131 09:16:14.224207 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.224215 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.224368 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.224587 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:16.224550653 +0000 UTC m=+686.893234121 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.224608 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:16.224600347 +0000 UTC m=+686.893283815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.961489 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" podUID="2d0fc101-8afd-4154-9741-d5d3520990fe" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965462 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" podUID="c42112c2-917b-491b-9a4a-5253a0fc8d09" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965528 4783 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" podUID="5bc18646-6e2c-41b5-8690-b6b7eda1a8cc" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965577 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" podUID="cd28073a-c4f4-4b1d-9680-a9d5a5939deb" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965620 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" podUID="caf4a0bd-2f55-4185-a756-4a640cbfe8d3" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965663 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" podUID="8ad3860f-1b60-4522-8026-08212156646d" Jan 31 09:16:14 crc kubenswrapper[4783]: E0131 09:16:14.965697 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" podUID="d6255583-1dd4-4901-b3af-8619aa03434b" Jan 31 09:16:15 crc kubenswrapper[4783]: I0131 09:16:15.740760 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:15 crc kubenswrapper[4783]: E0131 09:16:15.740940 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:15 crc kubenswrapper[4783]: E0131 09:16:15.740981 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:19.740969685 +0000 UTC m=+690.409653154 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:15 crc kubenswrapper[4783]: I0131 09:16:15.942704 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:15 crc kubenswrapper[4783]: E0131 09:16:15.942858 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:15 crc kubenswrapper[4783]: E0131 09:16:15.942917 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:19.942906339 +0000 UTC m=+690.611589808 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:16 crc kubenswrapper[4783]: I0131 09:16:16.245420 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:16 crc kubenswrapper[4783]: I0131 09:16:16.245491 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:16 crc kubenswrapper[4783]: E0131 09:16:16.245570 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:16 crc kubenswrapper[4783]: E0131 09:16:16.245618 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:16 crc kubenswrapper[4783]: E0131 09:16:16.245624 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:20.24560975 +0000 UTC m=+690.914293218 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:16 crc kubenswrapper[4783]: E0131 09:16:16.245670 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:20.245657209 +0000 UTC m=+690.914340668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:17 crc kubenswrapper[4783]: I0131 09:16:17.756769 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:16:17 crc kubenswrapper[4783]: I0131 09:16:17.757332 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:16:19 crc kubenswrapper[4783]: I0131 09:16:19.799661 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod 
\"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:19 crc kubenswrapper[4783]: E0131 09:16:19.799943 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:19 crc kubenswrapper[4783]: E0131 09:16:19.800075 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:27.800052964 +0000 UTC m=+698.468736433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: I0131 09:16:20.004788 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.005080 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.005438 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:16:28.005409973 +0000 UTC m=+698.674093432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: I0131 09:16:20.310185 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:20 crc kubenswrapper[4783]: I0131 09:16:20.310276 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.310355 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.310453 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:28.310429739 +0000 UTC m=+698.979113207 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.310445 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:20 crc kubenswrapper[4783]: E0131 09:16:20.310492 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:28.310483901 +0000 UTC m=+698.979167369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.008134 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" event={"ID":"b4b9ea20-ea14-4c87-b40e-5767debc9f57","Type":"ContainerStarted","Data":"8a137dfa6c66d170a0461ef0ff6ee6e5c54adb5c988703d5453f7ac02f0e7da7"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.008736 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.010399 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" 
event={"ID":"99914340-4708-4322-996f-7392f6fe6e02","Type":"ContainerStarted","Data":"c758aba61a5b7d61ec21505468acd71ec72e553e14ba5caa4ffcaa422121d100"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.010581 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.012427 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" event={"ID":"e4606683-7a0b-4a0f-ae81-3c6e598a36e6","Type":"ContainerStarted","Data":"c3c56fe58348ebac5d84ba8b9cc7953130956d96ec8d56c2ec32710398836427"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.012518 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.013490 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" event={"ID":"1e1e07c3-0aeb-47fd-be71-a13716a04f29","Type":"ContainerStarted","Data":"48204aefe0f11fdc6dfc52bd5d17fbcef55dd7a13f479885a95a99abe5e16477"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.014217 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.015458 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" event={"ID":"fa110500-5ebd-4645-86a1-e3bf4b9780fe","Type":"ContainerStarted","Data":"21d0b190cb4ca3d99c18a8a0267fac96da7a48302c44329f4bc5e674fd4c2f13"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.015822 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.016887 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" event={"ID":"be9f1345-8ca5-49da-a52e-4b841ea07ac3","Type":"ContainerStarted","Data":"781a8768e694d8205f8a1d6f1409dbb6ecf3df4ff4c378207fc3231530b53f2e"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.017253 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.018983 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" event={"ID":"acb9dbfe-e754-4021-bc54-7ccd17b217a4","Type":"ContainerStarted","Data":"2c7462cc8f019c5f475c2cc8b7d2011e8e67d808be777e5ebd81782f2fdbb37d"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.019415 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.021659 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" event={"ID":"1de23cab-104f-49ac-ab9f-3b1d08733ff9","Type":"ContainerStarted","Data":"b4caf6b38b51c78d6e61fa309d8adffa21d6e12ec2580c1a32042960322ae464"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.022017 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.026967 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" 
event={"ID":"8178283d-c10c-45e6-a465-bdb5096d8904","Type":"ContainerStarted","Data":"f2645a16825af019cb061bf95461130d3466f4296f8abf67dd6723ce7e8f372c"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.027374 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.028838 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" event={"ID":"3a079322-76ea-4cb9-a8b6-3f0b1a360086","Type":"ContainerStarted","Data":"52696daf460daea628330a53132840ab2ca00bdd0f6f0655ce9ac737f5507f43"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.028942 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" podStartSLOduration=1.798046534 podStartE2EDuration="11.028931303s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.404293476 +0000 UTC m=+684.072976944" lastFinishedPulling="2026-01-31 09:16:22.635178244 +0000 UTC m=+693.303861713" observedRunningTime="2026-01-31 09:16:23.027527108 +0000 UTC m=+693.696210576" watchObservedRunningTime="2026-01-31 09:16:23.028931303 +0000 UTC m=+693.697614771" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.029209 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.031508 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" event={"ID":"bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352","Type":"ContainerStarted","Data":"bf8de2b86064c1d2d62a3eebf3f5d3ac9712931ff7dbbf039a1be74a1993746d"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.031844 4783 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.034189 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" event={"ID":"8fdee142-92ef-49d1-bac6-6f6c3873b2cb","Type":"ContainerStarted","Data":"f5691bc0336422e593a2649e788329ead571684f03cf864dd80f8e3dd62ed7d4"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.034539 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.039347 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" event={"ID":"7ddb9dd0-fc57-4685-a7d5-778a4152ea58","Type":"ContainerStarted","Data":"c39d2d7ce0927fe6b0c036eb103f35ff627ab5e47821e1137d3b4153705ec423"} Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.039584 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.044035 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" podStartSLOduration=2.641306271 podStartE2EDuration="12.044022158s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.220720132 +0000 UTC m=+683.889403600" lastFinishedPulling="2026-01-31 09:16:22.623436019 +0000 UTC m=+693.292119487" observedRunningTime="2026-01-31 09:16:23.038680302 +0000 UTC m=+693.707363770" watchObservedRunningTime="2026-01-31 09:16:23.044022158 +0000 UTC m=+693.712705626" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.069477 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" podStartSLOduration=2.369057914 podStartE2EDuration="12.069461321s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:12.937983724 +0000 UTC m=+683.606667192" lastFinishedPulling="2026-01-31 09:16:22.638387131 +0000 UTC m=+693.307070599" observedRunningTime="2026-01-31 09:16:23.063526769 +0000 UTC m=+693.732210236" watchObservedRunningTime="2026-01-31 09:16:23.069461321 +0000 UTC m=+693.738144789" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.077101 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" podStartSLOduration=2.768376399 podStartE2EDuration="12.077086778s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.315189313 +0000 UTC m=+683.983872781" lastFinishedPulling="2026-01-31 09:16:22.623899693 +0000 UTC m=+693.292583160" observedRunningTime="2026-01-31 09:16:23.076093387 +0000 UTC m=+693.744776855" watchObservedRunningTime="2026-01-31 09:16:23.077086778 +0000 UTC m=+693.745770246" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.097401 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" podStartSLOduration=2.760175787 podStartE2EDuration="12.097378232s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.301655513 +0000 UTC m=+683.970338981" lastFinishedPulling="2026-01-31 09:16:22.638857958 +0000 UTC m=+693.307541426" observedRunningTime="2026-01-31 09:16:23.090410444 +0000 UTC m=+693.759093912" watchObservedRunningTime="2026-01-31 09:16:23.097378232 +0000 UTC m=+693.766061700" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.122533 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" podStartSLOduration=2.684316602 podStartE2EDuration="12.122509746s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.199121045 +0000 UTC m=+683.867804513" lastFinishedPulling="2026-01-31 09:16:22.637314188 +0000 UTC m=+693.305997657" observedRunningTime="2026-01-31 09:16:23.116109626 +0000 UTC m=+693.784793095" watchObservedRunningTime="2026-01-31 09:16:23.122509746 +0000 UTC m=+693.791193214" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.135303 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" podStartSLOduration=2.832543994 podStartE2EDuration="12.135289815s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.304014428 +0000 UTC m=+683.972697896" lastFinishedPulling="2026-01-31 09:16:22.606760249 +0000 UTC m=+693.275443717" observedRunningTime="2026-01-31 09:16:23.131938522 +0000 UTC m=+693.800621990" watchObservedRunningTime="2026-01-31 09:16:23.135289815 +0000 UTC m=+693.803973284" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.182244 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" podStartSLOduration=2.607072756 podStartE2EDuration="12.182220664s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.061606296 +0000 UTC m=+683.730289764" lastFinishedPulling="2026-01-31 09:16:22.636754203 +0000 UTC m=+693.305437672" observedRunningTime="2026-01-31 09:16:23.169216913 +0000 UTC m=+693.837900381" watchObservedRunningTime="2026-01-31 09:16:23.182220664 +0000 UTC m=+693.850904133" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.194098 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" podStartSLOduration=2.645297521 podStartE2EDuration="12.194080762s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.075326478 +0000 UTC m=+683.744009946" lastFinishedPulling="2026-01-31 09:16:22.624109718 +0000 UTC m=+693.292793187" observedRunningTime="2026-01-31 09:16:23.193440405 +0000 UTC m=+693.862123873" watchObservedRunningTime="2026-01-31 09:16:23.194080762 +0000 UTC m=+693.862764230" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.238695 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" podStartSLOduration=2.501239162 podStartE2EDuration="12.238672452s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:12.898239577 +0000 UTC m=+683.566923044" lastFinishedPulling="2026-01-31 09:16:22.635672866 +0000 UTC m=+693.304356334" observedRunningTime="2026-01-31 09:16:23.238409006 +0000 UTC m=+693.907092464" watchObservedRunningTime="2026-01-31 09:16:23.238672452 +0000 UTC m=+693.907355920" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.241626 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" podStartSLOduration=4.734376934 podStartE2EDuration="12.24162186s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.223127319 +0000 UTC m=+683.891810787" lastFinishedPulling="2026-01-31 09:16:20.730372245 +0000 UTC m=+691.399055713" observedRunningTime="2026-01-31 09:16:23.219650691 +0000 UTC m=+693.888334159" watchObservedRunningTime="2026-01-31 09:16:23.24162186 +0000 UTC m=+693.910305328" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.319216 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" podStartSLOduration=3.215461148 podStartE2EDuration="12.319201968s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.520096384 +0000 UTC m=+684.188779853" lastFinishedPulling="2026-01-31 09:16:22.623837205 +0000 UTC m=+693.292520673" observedRunningTime="2026-01-31 09:16:23.314602742 +0000 UTC m=+693.983286210" watchObservedRunningTime="2026-01-31 09:16:23.319201968 +0000 UTC m=+693.987885436" Jan 31 09:16:23 crc kubenswrapper[4783]: I0131 09:16:23.319585 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" podStartSLOduration=2.593634587 podStartE2EDuration="12.319579108s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:12.89876653 +0000 UTC m=+683.567449998" lastFinishedPulling="2026-01-31 09:16:22.624711051 +0000 UTC m=+693.293394519" observedRunningTime="2026-01-31 09:16:23.278050259 +0000 UTC m=+693.946733727" watchObservedRunningTime="2026-01-31 09:16:23.319579108 +0000 UTC m=+693.988262566" Jan 31 09:16:27 crc kubenswrapper[4783]: I0131 09:16:27.829011 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:27 crc kubenswrapper[4783]: E0131 09:16:27.829219 4783 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:27 crc kubenswrapper[4783]: E0131 09:16:27.829660 4783 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert podName:5033c800-ef69-4228-a204-b66401c4725c nodeName:}" failed. No retries permitted until 2026-01-31 09:16:43.829644418 +0000 UTC m=+714.498327877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert") pod "infra-operator-controller-manager-79955696d6-rn96f" (UID: "5033c800-ef69-4228-a204-b66401c4725c") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: I0131 09:16:28.031512 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.031699 4783 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.032572 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert podName:10649b85-8b3a-44e2-9477-6f5821d232a7 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:44.032548224 +0000 UTC m=+714.701231692 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" (UID: "10649b85-8b3a-44e2-9477-6f5821d232a7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: I0131 09:16:28.337241 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.337450 4783 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.337536 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:44.337514399 +0000 UTC m=+715.006197868 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "webhook-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: I0131 09:16:28.337470 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.337622 4783 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:16:28 crc kubenswrapper[4783]: E0131 09:16:28.337741 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs podName:6360a87f-2ddf-4c17-9f25-cff4e0f5e747 nodeName:}" failed. No retries permitted until 2026-01-31 09:16:44.337717511 +0000 UTC m=+715.006400979 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-jqzh7" (UID: "6360a87f-2ddf-4c17-9f25-cff4e0f5e747") : secret "metrics-server-cert" not found Jan 31 09:16:29 crc kubenswrapper[4783]: I0131 09:16:29.081449 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" event={"ID":"2d0fc101-8afd-4154-9741-d5d3520990fe","Type":"ContainerStarted","Data":"f2bee08cd5e30a1d4819debbc4784a0f4a73122992fec41d724035fda7b08e72"} Jan 31 09:16:29 crc kubenswrapper[4783]: I0131 09:16:29.096197 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pfgj4" podStartSLOduration=1.956582911 podStartE2EDuration="17.096181046s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.558383425 +0000 UTC m=+684.227066893" lastFinishedPulling="2026-01-31 09:16:28.697981559 +0000 UTC m=+699.366665028" observedRunningTime="2026-01-31 09:16:29.093041642 +0000 UTC m=+699.761725109" watchObservedRunningTime="2026-01-31 09:16:29.096181046 +0000 UTC m=+699.764864514" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.135503 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-gwkhk" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.172550 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-vjgpf" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.193374 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-k9948" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 
09:16:32.243186 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-btfdw" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.250804 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-4nwdm" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.264752 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-ln9h6" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.287382 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-82z66" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.363514 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-bxv5z" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.383411 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-fvml6" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.383738 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-tmf9x" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.443969 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s842k" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.636415 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-97h8r" Jan 31 09:16:32 crc kubenswrapper[4783]: I0131 09:16:32.770312 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-sbgw8" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.132807 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" event={"ID":"caf4a0bd-2f55-4185-a756-4a640cbfe8d3","Type":"ContainerStarted","Data":"3bc2f333da552518fb45d923fbc03ec0685d243a9a54a57ed37e27ce86d04233"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.133799 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.135004 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" event={"ID":"8ad3860f-1b60-4522-8026-08212156646d","Type":"ContainerStarted","Data":"f0d411438faa52076c99fafab80fbfdcfa9c20f9836c9930f6ee7e7cf96e9162"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.135183 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.136488 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" event={"ID":"5bc18646-6e2c-41b5-8690-b6b7eda1a8cc","Type":"ContainerStarted","Data":"29d92e60ff97685668057908f0c6bab939fb2c71a8b61be88fac9d11e6e9043a"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.136670 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.137573 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" 
event={"ID":"cd28073a-c4f4-4b1d-9680-a9d5a5939deb","Type":"ContainerStarted","Data":"2c8d46decda1ad981d43c529a21c46bfec3f33ba135e752336066e7209ac865c"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.137707 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.138786 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" event={"ID":"c42112c2-917b-491b-9a4a-5253a0fc8d09","Type":"ContainerStarted","Data":"ae9429cae0e61f7d330d8a466b598fcf4d4a8494ddba48da0b9437ec5e24f0d0"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.138962 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.140067 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" event={"ID":"d6255583-1dd4-4901-b3af-8619aa03434b","Type":"ContainerStarted","Data":"8029c271493e06e95d6c51eb88dec19be6c0fe083523c0f77cbbf7b2c576672d"} Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.140212 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.149279 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" podStartSLOduration=1.909848333 podStartE2EDuration="22.149268682s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.412223157 +0000 UTC m=+684.080906624" lastFinishedPulling="2026-01-31 09:16:33.651643506 +0000 UTC m=+704.320326973" observedRunningTime="2026-01-31 
09:16:34.147853626 +0000 UTC m=+704.816537095" watchObservedRunningTime="2026-01-31 09:16:34.149268682 +0000 UTC m=+704.817952151" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.162385 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" podStartSLOduration=1.926385034 podStartE2EDuration="22.162374617s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.41205465 +0000 UTC m=+684.080738118" lastFinishedPulling="2026-01-31 09:16:33.648044233 +0000 UTC m=+704.316727701" observedRunningTime="2026-01-31 09:16:34.160897864 +0000 UTC m=+704.829581332" watchObservedRunningTime="2026-01-31 09:16:34.162374617 +0000 UTC m=+704.831058085" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.181555 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" podStartSLOduration=3.395378676 podStartE2EDuration="22.181548435s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.408149692 +0000 UTC m=+684.076833160" lastFinishedPulling="2026-01-31 09:16:32.194319451 +0000 UTC m=+702.863002919" observedRunningTime="2026-01-31 09:16:34.174973756 +0000 UTC m=+704.843657224" watchObservedRunningTime="2026-01-31 09:16:34.181548435 +0000 UTC m=+704.850231903" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.254572 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" podStartSLOduration=2.004569921 podStartE2EDuration="22.254558321s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.418336566 +0000 UTC m=+684.087020035" lastFinishedPulling="2026-01-31 09:16:33.668324966 +0000 UTC m=+704.337008435" observedRunningTime="2026-01-31 09:16:34.211395132 +0000 UTC 
m=+704.880078601" watchObservedRunningTime="2026-01-31 09:16:34.254558321 +0000 UTC m=+704.923241790" Jan 31 09:16:34 crc kubenswrapper[4783]: I0131 09:16:34.255529 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" podStartSLOduration=4.616277898 podStartE2EDuration="23.255525684s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.555141858 +0000 UTC m=+684.223825326" lastFinishedPulling="2026-01-31 09:16:32.194389644 +0000 UTC m=+702.863073112" observedRunningTime="2026-01-31 09:16:34.244233287 +0000 UTC m=+704.912916755" watchObservedRunningTime="2026-01-31 09:16:34.255525684 +0000 UTC m=+704.924209152" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.573470 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-x86p2" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.583261 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.588863 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-x6rmh" podStartSLOduration=10.337688479 podStartE2EDuration="30.588834481s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:13.4168048 +0000 UTC m=+684.085488268" lastFinishedPulling="2026-01-31 09:16:33.667950802 +0000 UTC m=+704.336634270" observedRunningTime="2026-01-31 09:16:34.293839707 +0000 UTC m=+704.962523174" watchObservedRunningTime="2026-01-31 09:16:42.588834481 +0000 UTC m=+713.257517950" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.603965 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-t7nxg" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.631559 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-5hw2c" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.678182 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-8dwmb" Jan 31 09:16:42 crc kubenswrapper[4783]: I0131 09:16:42.696106 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-9d95x" Jan 31 09:16:43 crc kubenswrapper[4783]: I0131 09:16:43.855612 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:43 crc kubenswrapper[4783]: I0131 09:16:43.861611 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5033c800-ef69-4228-a204-b66401c4725c-cert\") pod \"infra-operator-controller-manager-79955696d6-rn96f\" (UID: \"5033c800-ef69-4228-a204-b66401c4725c\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.057230 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.060725 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10649b85-8b3a-44e2-9477-6f5821d232a7-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp\" (UID: \"10649b85-8b3a-44e2-9477-6f5821d232a7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.071115 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.345949 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.361942 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.362020 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.365865 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.366394 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6360a87f-2ddf-4c17-9f25-cff4e0f5e747-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-jqzh7\" (UID: \"6360a87f-2ddf-4c17-9f25-cff4e0f5e747\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.374338 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.450475 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-rn96f"] Jan 31 09:16:44 crc kubenswrapper[4783]: W0131 09:16:44.459765 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5033c800_ef69_4228_a204_b66401c4725c.slice/crio-da75dc5c77de447d4a01dff910ae72eb8adfecc90c091cf3487829e50fcea142 WatchSource:0}: Error finding container da75dc5c77de447d4a01dff910ae72eb8adfecc90c091cf3487829e50fcea142: Status 404 returned error can't find the container with id da75dc5c77de447d4a01dff910ae72eb8adfecc90c091cf3487829e50fcea142 Jan 31 09:16:44 crc kubenswrapper[4783]: W0131 09:16:44.754951 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10649b85_8b3a_44e2_9477_6f5821d232a7.slice/crio-8c5f148132255eeaff4b9df5cff90a73a00e39f35cdaea30598ad4442586c340 WatchSource:0}: Error finding container 8c5f148132255eeaff4b9df5cff90a73a00e39f35cdaea30598ad4442586c340: Status 404 returned error can't find the container with id 8c5f148132255eeaff4b9df5cff90a73a00e39f35cdaea30598ad4442586c340 Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.756045 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp"] Jan 31 09:16:44 crc kubenswrapper[4783]: I0131 09:16:44.822995 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7"] Jan 31 09:16:44 crc kubenswrapper[4783]: W0131 09:16:44.827234 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6360a87f_2ddf_4c17_9f25_cff4e0f5e747.slice/crio-c344aae08ec4bf97e2edf4bbb1a26d6e85860bb7f82a8c738fdd5b78db5d8353 WatchSource:0}: Error finding container c344aae08ec4bf97e2edf4bbb1a26d6e85860bb7f82a8c738fdd5b78db5d8353: Status 404 returned error can't find the container with id c344aae08ec4bf97e2edf4bbb1a26d6e85860bb7f82a8c738fdd5b78db5d8353 Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.205822 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" event={"ID":"10649b85-8b3a-44e2-9477-6f5821d232a7","Type":"ContainerStarted","Data":"8c5f148132255eeaff4b9df5cff90a73a00e39f35cdaea30598ad4442586c340"} Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.208022 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" 
event={"ID":"5033c800-ef69-4228-a204-b66401c4725c","Type":"ContainerStarted","Data":"da75dc5c77de447d4a01dff910ae72eb8adfecc90c091cf3487829e50fcea142"} Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.209686 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" event={"ID":"6360a87f-2ddf-4c17-9f25-cff4e0f5e747","Type":"ContainerStarted","Data":"5d05dcf71a7b61159078ff90a8e354fe621d3465ae614baed30c461aae3aaba2"} Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.209713 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" event={"ID":"6360a87f-2ddf-4c17-9f25-cff4e0f5e747","Type":"ContainerStarted","Data":"c344aae08ec4bf97e2edf4bbb1a26d6e85860bb7f82a8c738fdd5b78db5d8353"} Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.209977 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:16:45 crc kubenswrapper[4783]: I0131 09:16:45.235411 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" podStartSLOduration=33.235393341 podStartE2EDuration="33.235393341s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:45.230368361 +0000 UTC m=+715.899051829" watchObservedRunningTime="2026-01-31 09:16:45.235393341 +0000 UTC m=+715.904076810" Jan 31 09:16:47 crc kubenswrapper[4783]: I0131 09:16:47.226518 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" 
event={"ID":"10649b85-8b3a-44e2-9477-6f5821d232a7","Type":"ContainerStarted","Data":"c6274b618df469015c740137e68585133591376d923a0adcc1fe697a8e145303"} Jan 31 09:16:47 crc kubenswrapper[4783]: I0131 09:16:47.226967 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:47 crc kubenswrapper[4783]: I0131 09:16:47.257980 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" podStartSLOduration=33.225660667 podStartE2EDuration="35.257969973s" podCreationTimestamp="2026-01-31 09:16:12 +0000 UTC" firstStartedPulling="2026-01-31 09:16:44.757200468 +0000 UTC m=+715.425883936" lastFinishedPulling="2026-01-31 09:16:46.789509773 +0000 UTC m=+717.458193242" observedRunningTime="2026-01-31 09:16:47.250755597 +0000 UTC m=+717.919439075" watchObservedRunningTime="2026-01-31 09:16:47.257969973 +0000 UTC m=+717.926653441" Jan 31 09:16:47 crc kubenswrapper[4783]: I0131 09:16:47.756523 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:16:47 crc kubenswrapper[4783]: I0131 09:16:47.757026 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:16:49 crc kubenswrapper[4783]: I0131 09:16:49.240559 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" 
event={"ID":"5033c800-ef69-4228-a204-b66401c4725c","Type":"ContainerStarted","Data":"fa9ebc3663d53e73df131c04bf5b3459283fb98c56bb9e9cfb57c5d7f32fb83e"} Jan 31 09:16:49 crc kubenswrapper[4783]: I0131 09:16:49.240760 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:49 crc kubenswrapper[4783]: I0131 09:16:49.257481 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" podStartSLOduration=34.571970466 podStartE2EDuration="38.257470796s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:44.4629537 +0000 UTC m=+715.131637169" lastFinishedPulling="2026-01-31 09:16:48.148454031 +0000 UTC m=+718.817137499" observedRunningTime="2026-01-31 09:16:49.254225571 +0000 UTC m=+719.922909039" watchObservedRunningTime="2026-01-31 09:16:49.257470796 +0000 UTC m=+719.926154265" Jan 31 09:16:54 crc kubenswrapper[4783]: I0131 09:16:54.077617 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-rn96f" Jan 31 09:16:54 crc kubenswrapper[4783]: I0131 09:16:54.352052 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp" Jan 31 09:16:54 crc kubenswrapper[4783]: I0131 09:16:54.381918 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-jqzh7" Jan 31 09:17:00 crc kubenswrapper[4783]: I0131 09:17:00.623535 4783 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.501923 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.503400 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.506058 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.506498 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.506638 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.507129 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nzhpf" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.509399 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.543207 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.544288 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.545206 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.547733 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.591574 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58tt\" (UniqueName: \"kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.591619 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.693330 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.693376 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 
09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.693431 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58tt\" (UniqueName: \"kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.693457 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.693507 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9ff\" (UniqueName: \"kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.694513 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.713060 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58tt\" (UniqueName: \"kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt\") pod \"dnsmasq-dns-84bb9d8bd9-4rmlb\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 
09:17:09.794421 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9ff\" (UniqueName: \"kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.794744 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.794768 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.795696 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.795715 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.811150 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9ff\" 
(UniqueName: \"kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff\") pod \"dnsmasq-dns-5f854695bc-cj5n5\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.819053 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:09 crc kubenswrapper[4783]: I0131 09:17:09.869033 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:10 crc kubenswrapper[4783]: I0131 09:17:10.195883 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:10 crc kubenswrapper[4783]: I0131 09:17:10.252630 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:10 crc kubenswrapper[4783]: W0131 09:17:10.256401 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde8c1882_a305_4f69_adb7_3551b7f8f779.slice/crio-28de23d4c1c674cd760a0b93322cdad80b4f02e88fb0d751805be846f3fca0ca WatchSource:0}: Error finding container 28de23d4c1c674cd760a0b93322cdad80b4f02e88fb0d751805be846f3fca0ca: Status 404 returned error can't find the container with id 28de23d4c1c674cd760a0b93322cdad80b4f02e88fb0d751805be846f3fca0ca Jan 31 09:17:10 crc kubenswrapper[4783]: I0131 09:17:10.379172 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" event={"ID":"de8c1882-a305-4f69-adb7-3551b7f8f779","Type":"ContainerStarted","Data":"28de23d4c1c674cd760a0b93322cdad80b4f02e88fb0d751805be846f3fca0ca"} Jan 31 09:17:10 crc kubenswrapper[4783]: I0131 09:17:10.380031 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" 
event={"ID":"6516f5a8-f446-4407-a048-95068a0b03a0","Type":"ContainerStarted","Data":"6c40c81b500fa6d748c3f7357dd01dc0b0287f5212c7849c9c3ed050b30937bc"} Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.292050 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.320285 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.340500 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.340805 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.441588 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.441742 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.441831 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szmk5\" (UniqueName: \"kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " 
pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.542264 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.543923 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.543986 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.544023 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szmk5\" (UniqueName: \"kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.545003 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.545758 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" 
(UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.563593 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.564975 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.572329 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.572774 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szmk5\" (UniqueName: \"kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5\") pod \"dnsmasq-dns-744ffd65bc-h29qg\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.670325 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.750996 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6v5c\" (UniqueName: \"kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.751252 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.751296 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.852947 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.853255 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " 
pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.853304 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6v5c\" (UniqueName: \"kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.853746 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.853852 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.867274 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6v5c\" (UniqueName: \"kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c\") pod \"dnsmasq-dns-95f5f6995-8mf27\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:12 crc kubenswrapper[4783]: I0131 09:17:12.898751 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.070829 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.448933 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.450584 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.453951 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dcjql" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.453964 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.454404 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.454467 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.455957 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.456087 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.456107 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.460193 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564239 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564273 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564312 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnp8h\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564343 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564537 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564609 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564653 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564678 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564800 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564920 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.564957 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.667364 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.667735 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.667792 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.667809 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668045 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " 
pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668200 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668227 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668397 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668514 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668544 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.668558 4783 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vnp8h\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.669481 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.670728 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.670953 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.677866 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.683402 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.683910 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.684656 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.686107 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.686119 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.688874 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.690113 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 
31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.691859 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.692283 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.692427 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.693764 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnp8h\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h\") pod \"rabbitmq-server-0\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697335 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697338 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697545 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697594 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697753 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697793 4783 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.697878 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs8js" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.775780 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.875894 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.875981 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876011 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876041 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876083 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876137 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876265 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.876504 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.877803 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kprh4\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.877836 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.877911 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.980904 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.980954 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.980989 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 
31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981022 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981043 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kprh4\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981093 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981121 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981221 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981242 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981261 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981282 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981727 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.981897 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.982097 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.982224 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.982644 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.983310 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.984965 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.985491 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.989560 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.991814 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.996956 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kprh4\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:13 crc kubenswrapper[4783]: I0131 09:17:13.998661 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.046174 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.958205 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.960026 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.972288 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5sfvd" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.977299 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.978518 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.978627 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.983628 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:17:14 crc kubenswrapper[4783]: I0131 09:17:14.989270 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.099970 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn72d\" (UniqueName: \"kubernetes.io/projected/03eade59-3312-49be-a51a-9fdcd37f9a33-kube-api-access-mn72d\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100065 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100188 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-default\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100217 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100254 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-generated\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100283 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-operator-scripts\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100317 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-kolla-config\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.100367 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.207847 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208025 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-default\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208062 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208110 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208140 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-operator-scripts\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208201 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-kolla-config\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208233 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.208366 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn72d\" (UniqueName: \"kubernetes.io/projected/03eade59-3312-49be-a51a-9fdcd37f9a33-kube-api-access-mn72d\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.209852 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-generated\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.210946 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-config-data-default\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.212561 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.212873 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-kolla-config\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.214594 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.215531 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03eade59-3312-49be-a51a-9fdcd37f9a33-operator-scripts\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.222834 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03eade59-3312-49be-a51a-9fdcd37f9a33-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.232910 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn72d\" (UniqueName: \"kubernetes.io/projected/03eade59-3312-49be-a51a-9fdcd37f9a33-kube-api-access-mn72d\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.243356 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"03eade59-3312-49be-a51a-9fdcd37f9a33\") " pod="openstack/openstack-galera-0" Jan 31 09:17:15 crc kubenswrapper[4783]: I0131 09:17:15.286612 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.312525 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.313943 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.320378 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.320594 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-djxz7" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.320744 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.323570 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.327941 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.429570 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.429628 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.429842 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.429888 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl64r\" (UniqueName: \"kubernetes.io/projected/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kube-api-access-wl64r\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.429994 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.430054 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.430152 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.430264 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.437873 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" event={"ID":"009682a7-982d-4cc3-9d7f-704c0c7c8d84","Type":"ContainerStarted","Data":"c80572e43e1bedfb4db431e6822af3e7b60be2bb7ab73dfd114ca55becb2240e"} Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532559 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532594 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl64r\" (UniqueName: \"kubernetes.io/projected/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kube-api-access-wl64r\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532643 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532683 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532752 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532810 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532895 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.532918 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.533211 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " 
pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.533875 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.534427 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.534689 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.539181 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.542671 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c9a5bd57-8542-4509-a620-c48d2f6c9e06-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 
09:17:16.547505 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl64r\" (UniqueName: \"kubernetes.io/projected/c9a5bd57-8542-4509-a620-c48d2f6c9e06-kube-api-access-wl64r\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.550980 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a5bd57-8542-4509-a620-c48d2f6c9e06-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.556453 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c9a5bd57-8542-4509-a620-c48d2f6c9e06\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.630840 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.765282 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.766549 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.768824 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.768824 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fsdzv" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.768823 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.776290 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.940003 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-config-data\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.940064 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dpbf\" (UniqueName: \"kubernetes.io/projected/c967214f-ce54-4ac2-ae54-2d750133ff97-kube-api-access-4dpbf\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.940147 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.940201 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-kolla-config\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:16 crc kubenswrapper[4783]: I0131 09:17:16.940220 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042102 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042179 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-kolla-config\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042345 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042496 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-config-data\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " 
pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042538 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dpbf\" (UniqueName: \"kubernetes.io/projected/c967214f-ce54-4ac2-ae54-2d750133ff97-kube-api-access-4dpbf\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.042854 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-kolla-config\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.043043 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c967214f-ce54-4ac2-ae54-2d750133ff97-config-data\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.047959 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.050863 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c967214f-ce54-4ac2-ae54-2d750133ff97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.056962 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dpbf\" (UniqueName: 
\"kubernetes.io/projected/c967214f-ce54-4ac2-ae54-2d750133ff97-kube-api-access-4dpbf\") pod \"memcached-0\" (UID: \"c967214f-ce54-4ac2-ae54-2d750133ff97\") " pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.092015 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.756995 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.757105 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.757212 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.759055 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:17:17 crc kubenswrapper[4783]: I0131 09:17:17.759152 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" 
containerName="machine-config-daemon" containerID="cri-o://63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663" gracePeriod=600 Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.460147 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663" exitCode=0 Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.460231 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663"} Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.460588 4783 scope.go:117] "RemoveContainer" containerID="fa19abb52300978825d77b73571f5c020e68f8b7df94a01ba156241b5ff00d6c" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.543242 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.545312 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.548418 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zhbt7" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.558150 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.578681 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsx6s\" (UniqueName: \"kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s\") pod \"kube-state-metrics-0\" (UID: \"4d24bc22-990e-4f9f-a39b-f16adc63dfbb\") " pod="openstack/kube-state-metrics-0" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.683091 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsx6s\" (UniqueName: \"kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s\") pod \"kube-state-metrics-0\" (UID: \"4d24bc22-990e-4f9f-a39b-f16adc63dfbb\") " pod="openstack/kube-state-metrics-0" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.722134 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsx6s\" (UniqueName: \"kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s\") pod \"kube-state-metrics-0\" (UID: \"4d24bc22-990e-4f9f-a39b-f16adc63dfbb\") " pod="openstack/kube-state-metrics-0" Jan 31 09:17:18 crc kubenswrapper[4783]: I0131 09:17:18.864142 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:17:21 crc kubenswrapper[4783]: I0131 09:17:21.671198 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.095029 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rll65"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.101204 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.105594 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.105793 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.105912 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rk7ds" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.106157 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rll65"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.140001 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k7st6"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.142404 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146063 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-combined-ca-bundle\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146109 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-ovn-controller-tls-certs\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146135 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-etc-ovs\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146180 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-lib\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146198 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ddgw\" (UniqueName: \"kubernetes.io/projected/8575964a-bedb-456c-b992-116f66bb7fa2-kube-api-access-6ddgw\") pod \"ovn-controller-ovs-k7st6\" (UID: 
\"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146219 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-run\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146239 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvtj\" (UniqueName: \"kubernetes.io/projected/ea827790-18ef-4c55-8b5f-365ead9b9f6c-kube-api-access-ntvtj\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146271 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146303 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea827790-18ef-4c55-8b5f-365ead9b9f6c-scripts\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146349 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8575964a-bedb-456c-b992-116f66bb7fa2-scripts\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " 
pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146373 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run-ovn\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146562 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-log-ovn\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.146598 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-log\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.147257 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7st6"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.248994 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8575964a-bedb-456c-b992-116f66bb7fa2-scripts\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249042 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run-ovn\") pod 
\"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249076 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-log-ovn\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249101 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-log\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249125 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-combined-ca-bundle\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249171 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-ovn-controller-tls-certs\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249189 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-etc-ovs\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 
crc kubenswrapper[4783]: I0131 09:17:22.249213 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-lib\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249228 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ddgw\" (UniqueName: \"kubernetes.io/projected/8575964a-bedb-456c-b992-116f66bb7fa2-kube-api-access-6ddgw\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249250 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-run\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvtj\" (UniqueName: \"kubernetes.io/projected/ea827790-18ef-4c55-8b5f-365ead9b9f6c-kube-api-access-ntvtj\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249287 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249314 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea827790-18ef-4c55-8b5f-365ead9b9f6c-scripts\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.249792 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-etc-ovs\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251195 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-log\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251284 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-log-ovn\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251423 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-lib\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251548 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8575964a-bedb-456c-b992-116f66bb7fa2-var-run\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " 
pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251564 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.251648 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea827790-18ef-4c55-8b5f-365ead9b9f6c-var-run-ovn\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.252756 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea827790-18ef-4c55-8b5f-365ead9b9f6c-scripts\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.257119 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8575964a-bedb-456c-b992-116f66bb7fa2-scripts\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.264105 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-ovn-controller-tls-certs\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.264141 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea827790-18ef-4c55-8b5f-365ead9b9f6c-combined-ca-bundle\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.270681 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ddgw\" (UniqueName: \"kubernetes.io/projected/8575964a-bedb-456c-b992-116f66bb7fa2-kube-api-access-6ddgw\") pod \"ovn-controller-ovs-k7st6\" (UID: \"8575964a-bedb-456c-b992-116f66bb7fa2\") " pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.274029 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvtj\" (UniqueName: \"kubernetes.io/projected/ea827790-18ef-4c55-8b5f-365ead9b9f6c-kube-api-access-ntvtj\") pod \"ovn-controller-rll65\" (UID: \"ea827790-18ef-4c55-8b5f-365ead9b9f6c\") " pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.408861 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.426675 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rll65" Jan 31 09:17:22 crc kubenswrapper[4783]: W0131 09:17:22.431899 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80811804_f71b_48dc_873f_8583a6b3e785.slice/crio-07f9ac3622a49f292d8093687055660872f566c20da13d39a4992eefb7d25178 WatchSource:0}: Error finding container 07f9ac3622a49f292d8093687055660872f566c20da13d39a4992eefb7d25178: Status 404 returned error can't find the container with id 07f9ac3622a49f292d8093687055660872f566c20da13d39a4992eefb7d25178 Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.476925 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.495173 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerStarted","Data":"31f705b1312ba3a9b8bcaff19857e3818870b150542ad99f835a9eca30384dbc"} Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.496519 4783 generic.go:334] "Generic (PLEG): container finished" podID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerID="1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0" exitCode=0 Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.496575 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" event={"ID":"009682a7-982d-4cc3-9d7f-704c0c7c8d84","Type":"ContainerDied","Data":"1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0"} Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.501095 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed"} Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.502108 4783 generic.go:334] "Generic (PLEG): container finished" podID="de8c1882-a305-4f69-adb7-3551b7f8f779" containerID="3d6f3807eae9297104d7ce8cc92d2e5feea1b398cf26f1a1aacfdb1a787e9f82" exitCode=0 Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.502152 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" event={"ID":"de8c1882-a305-4f69-adb7-3551b7f8f779","Type":"ContainerDied","Data":"3d6f3807eae9297104d7ce8cc92d2e5feea1b398cf26f1a1aacfdb1a787e9f82"} Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.505957 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-95f5f6995-8mf27" event={"ID":"80811804-f71b-48dc-873f-8583a6b3e785","Type":"ContainerStarted","Data":"07f9ac3622a49f292d8093687055660872f566c20da13d39a4992eefb7d25178"} Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.612450 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.638006 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: W0131 09:17:22.641729 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03eade59_3312_49be_a51a_9fdcd37f9a33.slice/crio-5f04d98a00301f9b520b51a1fa30412e8df84b3d6965b31f5601199f1f4cdacc WatchSource:0}: Error finding container 5f04d98a00301f9b520b51a1fa30412e8df84b3d6965b31f5601199f1f4cdacc: Status 404 returned error can't find the container with id 5f04d98a00301f9b520b51a1fa30412e8df84b3d6965b31f5601199f1f4cdacc Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.647508 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.654685 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.694676 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.878448 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.963305 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rll65"] Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.980821 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc\") pod \"de8c1882-a305-4f69-adb7-3551b7f8f779\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.981008 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config\") pod \"de8c1882-a305-4f69-adb7-3551b7f8f779\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.981115 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9ff\" (UniqueName: \"kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff\") pod \"de8c1882-a305-4f69-adb7-3551b7f8f779\" (UID: \"de8c1882-a305-4f69-adb7-3551b7f8f779\") " Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.986972 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff" (OuterVolumeSpecName: "kube-api-access-wl9ff") pod "de8c1882-a305-4f69-adb7-3551b7f8f779" (UID: "de8c1882-a305-4f69-adb7-3551b7f8f779"). InnerVolumeSpecName "kube-api-access-wl9ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.996591 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config" (OuterVolumeSpecName: "config") pod "de8c1882-a305-4f69-adb7-3551b7f8f779" (UID: "de8c1882-a305-4f69-adb7-3551b7f8f779"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:22 crc kubenswrapper[4783]: I0131 09:17:22.998647 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de8c1882-a305-4f69-adb7-3551b7f8f779" (UID: "de8c1882-a305-4f69-adb7-3551b7f8f779"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.084858 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.084895 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de8c1882-a305-4f69-adb7-3551b7f8f779-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.084908 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9ff\" (UniqueName: \"kubernetes.io/projected/de8c1882-a305-4f69-adb7-3551b7f8f779-kube-api-access-wl9ff\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.153815 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7st6"] Jan 31 09:17:23 crc kubenswrapper[4783]: W0131 09:17:23.167700 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8575964a_bedb_456c_b992_116f66bb7fa2.slice/crio-2793b3dbc41bf6c03200576b9b61250c11e8a0ab8a045ddb279762e02895bc0b WatchSource:0}: Error finding container 2793b3dbc41bf6c03200576b9b61250c11e8a0ab8a045ddb279762e02895bc0b: Status 404 returned error can't find the container with id 2793b3dbc41bf6c03200576b9b61250c11e8a0ab8a045ddb279762e02895bc0b Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.527895 4783 generic.go:334] "Generic (PLEG): container finished" podID="6516f5a8-f446-4407-a048-95068a0b03a0" containerID="1b941c0122b27fec5ff9a9bd5d0223314ac4b0cddeb5863da9db9fd6b6ef1047" exitCode=0 Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.528091 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" event={"ID":"6516f5a8-f446-4407-a048-95068a0b03a0","Type":"ContainerDied","Data":"1b941c0122b27fec5ff9a9bd5d0223314ac4b0cddeb5863da9db9fd6b6ef1047"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.531156 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c9a5bd57-8542-4509-a620-c48d2f6c9e06","Type":"ContainerStarted","Data":"6f3e489dba8fa8c06efbb42b2c0324532805ecd901576242ad45ff0a5405bb29"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.547294 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" event={"ID":"009682a7-982d-4cc3-9d7f-704c0c7c8d84","Type":"ContainerStarted","Data":"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.548476 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.553206 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" 
event={"ID":"de8c1882-a305-4f69-adb7-3551b7f8f779","Type":"ContainerDied","Data":"28de23d4c1c674cd760a0b93322cdad80b4f02e88fb0d751805be846f3fca0ca"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.553260 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-cj5n5" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.553298 4783 scope.go:117] "RemoveContainer" containerID="3d6f3807eae9297104d7ce8cc92d2e5feea1b398cf26f1a1aacfdb1a787e9f82" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.565694 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rll65" event={"ID":"ea827790-18ef-4c55-8b5f-365ead9b9f6c","Type":"ContainerStarted","Data":"b0d59ad3043e0ce775344fcb7d63324690cf6fae19d9aaad1a47384ffc90bf45"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.570685 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7st6" event={"ID":"8575964a-bedb-456c-b992-116f66bb7fa2","Type":"ContainerStarted","Data":"2793b3dbc41bf6c03200576b9b61250c11e8a0ab8a045ddb279762e02895bc0b"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.571061 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" podStartSLOduration=5.019728763 podStartE2EDuration="11.571044282s" podCreationTimestamp="2026-01-31 09:17:12 +0000 UTC" firstStartedPulling="2026-01-31 09:17:15.552729905 +0000 UTC m=+746.221413373" lastFinishedPulling="2026-01-31 09:17:22.104045424 +0000 UTC m=+752.772728892" observedRunningTime="2026-01-31 09:17:23.567470768 +0000 UTC m=+754.236154257" watchObservedRunningTime="2026-01-31 09:17:23.571044282 +0000 UTC m=+754.239727751" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.571911 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4d24bc22-990e-4f9f-a39b-f16adc63dfbb","Type":"ContainerStarted","Data":"69b8c1bf328e4446bd4627afbb35c253bb17283816be6385a9658223c6791896"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.574018 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerStarted","Data":"4610be82240c0e59547139cbf9cadcc6a98c5ddac8c30e568d2c1057efb0bfe1"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.575390 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"03eade59-3312-49be-a51a-9fdcd37f9a33","Type":"ContainerStarted","Data":"5f04d98a00301f9b520b51a1fa30412e8df84b3d6965b31f5601199f1f4cdacc"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.582113 4783 generic.go:334] "Generic (PLEG): container finished" podID="80811804-f71b-48dc-873f-8583a6b3e785" containerID="cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570" exitCode=0 Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.582231 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" event={"ID":"80811804-f71b-48dc-873f-8583a6b3e785","Type":"ContainerDied","Data":"cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.583814 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c967214f-ce54-4ac2-ae54-2d750133ff97","Type":"ContainerStarted","Data":"9e0cdf5fc23373fcab4c7b0ce62e21ff3be635909a3fc023b8b8802d3d57cd79"} Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.619362 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.629005 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-cj5n5"] Jan 31 09:17:23 crc kubenswrapper[4783]: 
I0131 09:17:23.640621 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fmnfb"] Jan 31 09:17:23 crc kubenswrapper[4783]: E0131 09:17:23.642909 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8c1882-a305-4f69-adb7-3551b7f8f779" containerName="init" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.642933 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8c1882-a305-4f69-adb7-3551b7f8f779" containerName="init" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.643190 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8c1882-a305-4f69-adb7-3551b7f8f779" containerName="init" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.646322 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.648812 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.649390 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.670026 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8c1882-a305-4f69-adb7-3551b7f8f779" path="/var/lib/kubelet/pods/de8c1882-a305-4f69-adb7-3551b7f8f779/volumes" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.671032 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fmnfb"] Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.802793 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " 
pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.803008 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e49f761-fcba-4ec3-9091-61f056e4eb58-config\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.803053 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovs-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.803217 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovn-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.803241 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4gb\" (UniqueName: \"kubernetes.io/projected/7e49f761-fcba-4ec3-9091-61f056e4eb58-kube-api-access-bv4gb\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.803319 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-combined-ca-bundle\") pod \"ovn-controller-metrics-fmnfb\" (UID: 
\"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.850221 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.905479 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv4gb\" (UniqueName: \"kubernetes.io/projected/7e49f761-fcba-4ec3-9091-61f056e4eb58-kube-api-access-bv4gb\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906309 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-combined-ca-bundle\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906440 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906512 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e49f761-fcba-4ec3-9091-61f056e4eb58-config\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906578 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovs-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906672 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovn-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.906964 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovn-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.907311 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7e49f761-fcba-4ec3-9091-61f056e4eb58-ovs-rundir\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.907831 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e49f761-fcba-4ec3-9091-61f056e4eb58-config\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.914213 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.915252 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e49f761-fcba-4ec3-9091-61f056e4eb58-combined-ca-bundle\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.945573 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv4gb\" (UniqueName: \"kubernetes.io/projected/7e49f761-fcba-4ec3-9091-61f056e4eb58-kube-api-access-bv4gb\") pod \"ovn-controller-metrics-fmnfb\" (UID: \"7e49f761-fcba-4ec3-9091-61f056e4eb58\") " pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:23 crc kubenswrapper[4783]: I0131 09:17:23.987494 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fmnfb" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.008122 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58tt\" (UniqueName: \"kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt\") pod \"6516f5a8-f446-4407-a048-95068a0b03a0\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.008200 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config\") pod \"6516f5a8-f446-4407-a048-95068a0b03a0\" (UID: \"6516f5a8-f446-4407-a048-95068a0b03a0\") " Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.018266 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt" (OuterVolumeSpecName: "kube-api-access-w58tt") pod "6516f5a8-f446-4407-a048-95068a0b03a0" (UID: "6516f5a8-f446-4407-a048-95068a0b03a0"). InnerVolumeSpecName "kube-api-access-w58tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.029792 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config" (OuterVolumeSpecName: "config") pod "6516f5a8-f446-4407-a048-95068a0b03a0" (UID: "6516f5a8-f446-4407-a048-95068a0b03a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.097202 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:17:24 crc kubenswrapper[4783]: E0131 09:17:24.097538 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6516f5a8-f446-4407-a048-95068a0b03a0" containerName="init" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.097555 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6516f5a8-f446-4407-a048-95068a0b03a0" containerName="init" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.097733 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6516f5a8-f446-4407-a048-95068a0b03a0" containerName="init" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.098471 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.101847 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.102051 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.102207 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fcchc" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.102428 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.104979 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.109233 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58tt\" (UniqueName: 
\"kubernetes.io/projected/6516f5a8-f446-4407-a048-95068a0b03a0-kube-api-access-w58tt\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.109252 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516f5a8-f446-4407-a048-95068a0b03a0-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.210549 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.210850 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-config\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211003 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211024 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211082 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211113 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gfb\" (UniqueName: \"kubernetes.io/projected/bed60581-fb96-4e66-bd14-2e2c0f75a771-kube-api-access-f4gfb\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211144 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.211207 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312366 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-config\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312473 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312514 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312540 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312582 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gfb\" (UniqueName: \"kubernetes.io/projected/bed60581-fb96-4e66-bd14-2e2c0f75a771-kube-api-access-f4gfb\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312599 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312620 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " 
pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.312675 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.313029 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.313340 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.313974 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-config\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.314064 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed60581-fb96-4e66-bd14-2e2c0f75a771-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.317307 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.317605 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.318127 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed60581-fb96-4e66-bd14-2e2c0f75a771-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.327460 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gfb\" (UniqueName: \"kubernetes.io/projected/bed60581-fb96-4e66-bd14-2e2c0f75a771-kube-api-access-f4gfb\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.337033 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bed60581-fb96-4e66-bd14-2e2c0f75a771\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.421865 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.598570 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" event={"ID":"6516f5a8-f446-4407-a048-95068a0b03a0","Type":"ContainerDied","Data":"6c40c81b500fa6d748c3f7357dd01dc0b0287f5212c7849c9c3ed050b30937bc"} Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.598785 4783 scope.go:117] "RemoveContainer" containerID="1b941c0122b27fec5ff9a9bd5d0223314ac4b0cddeb5863da9db9fd6b6ef1047" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.598588 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-4rmlb" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.607039 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" event={"ID":"80811804-f71b-48dc-873f-8583a6b3e785","Type":"ContainerStarted","Data":"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7"} Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.607115 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.632872 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" podStartSLOduration=12.632852023 podStartE2EDuration="12.632852023s" podCreationTimestamp="2026-01-31 09:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:24.624372793 +0000 UTC m=+755.293056262" watchObservedRunningTime="2026-01-31 09:17:24.632852023 +0000 UTC m=+755.301535490" Jan 31 09:17:24 crc kubenswrapper[4783]: I0131 09:17:24.673570 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:24 crc 
kubenswrapper[4783]: I0131 09:17:24.678456 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-4rmlb"] Jan 31 09:17:25 crc kubenswrapper[4783]: I0131 09:17:25.016464 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fmnfb"] Jan 31 09:17:25 crc kubenswrapper[4783]: I0131 09:17:25.654123 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6516f5a8-f446-4407-a048-95068a0b03a0" path="/var/lib/kubelet/pods/6516f5a8-f446-4407-a048-95068a0b03a0/volumes" Jan 31 09:17:26 crc kubenswrapper[4783]: W0131 09:17:26.176087 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e49f761_fcba_4ec3_9091_61f056e4eb58.slice/crio-101f761e8c7657e06c5caaa5eee7b2b42b158003e0da3f09d20dc42c3dd09ab2 WatchSource:0}: Error finding container 101f761e8c7657e06c5caaa5eee7b2b42b158003e0da3f09d20dc42c3dd09ab2: Status 404 returned error can't find the container with id 101f761e8c7657e06c5caaa5eee7b2b42b158003e0da3f09d20dc42c3dd09ab2 Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.265812 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.267194 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.270924 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2nd5v" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.272300 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.274870 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.275138 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.277280 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359648 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359707 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e90cb35-3366-44df-9238-3da82d300654-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359812 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2lf\" (UniqueName: \"kubernetes.io/projected/8e90cb35-3366-44df-9238-3da82d300654-kube-api-access-rp2lf\") pod 
\"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359835 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359853 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359886 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.359987 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.360038 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 
09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461468 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e90cb35-3366-44df-9238-3da82d300654-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461552 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2lf\" (UniqueName: \"kubernetes.io/projected/8e90cb35-3366-44df-9238-3da82d300654-kube-api-access-rp2lf\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461571 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461587 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461627 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461652 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461666 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461718 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.461895 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.462807 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8e90cb35-3366-44df-9238-3da82d300654-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.463223 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-config\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " 
pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.463570 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e90cb35-3366-44df-9238-3da82d300654-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.467406 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.470394 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.470910 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e90cb35-3366-44df-9238-3da82d300654-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.474474 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2lf\" (UniqueName: \"kubernetes.io/projected/8e90cb35-3366-44df-9238-3da82d300654-kube-api-access-rp2lf\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.478607 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8e90cb35-3366-44df-9238-3da82d300654\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.594201 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:26 crc kubenswrapper[4783]: I0131 09:17:26.628289 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fmnfb" event={"ID":"7e49f761-fcba-4ec3-9091-61f056e4eb58","Type":"ContainerStarted","Data":"101f761e8c7657e06c5caaa5eee7b2b42b158003e0da3f09d20dc42c3dd09ab2"} Jan 31 09:17:27 crc kubenswrapper[4783]: I0131 09:17:27.675446 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:28 crc kubenswrapper[4783]: I0131 09:17:28.281408 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:17:30 crc kubenswrapper[4783]: W0131 09:17:30.363603 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbed60581_fb96_4e66_bd14_2e2c0f75a771.slice/crio-f6348e5f6ea92aae06f27dc9fe744da3dbfd14283a89d52fd91a9c07ab46bcce WatchSource:0}: Error finding container f6348e5f6ea92aae06f27dc9fe744da3dbfd14283a89d52fd91a9c07ab46bcce: Status 404 returned error can't find the container with id f6348e5f6ea92aae06f27dc9fe744da3dbfd14283a89d52fd91a9c07ab46bcce Jan 31 09:17:30 crc kubenswrapper[4783]: I0131 09:17:30.668483 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bed60581-fb96-4e66-bd14-2e2c0f75a771","Type":"ContainerStarted","Data":"f6348e5f6ea92aae06f27dc9fe744da3dbfd14283a89d52fd91a9c07ab46bcce"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.194887 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:17:31 crc kubenswrapper[4783]: W0131 09:17:31.202839 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e90cb35_3366_44df_9238_3da82d300654.slice/crio-f7123fd3cf7eadfe66da53172a48551c6fb45fb7ddbfd49e66b5fcdba621bad2 WatchSource:0}: Error finding container f7123fd3cf7eadfe66da53172a48551c6fb45fb7ddbfd49e66b5fcdba621bad2: Status 404 returned error can't find the container with id f7123fd3cf7eadfe66da53172a48551c6fb45fb7ddbfd49e66b5fcdba621bad2 Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.681835 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d24bc22-990e-4f9f-a39b-f16adc63dfbb","Type":"ContainerStarted","Data":"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.682188 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.684555 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c967214f-ce54-4ac2-ae54-2d750133ff97","Type":"ContainerStarted","Data":"ed8affae1e009424203a64fb78e9f47f8776a0064c654d474b7a767f82a3ef9a"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.684686 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.685988 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e90cb35-3366-44df-9238-3da82d300654","Type":"ContainerStarted","Data":"f7123fd3cf7eadfe66da53172a48551c6fb45fb7ddbfd49e66b5fcdba621bad2"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.687348 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"c9a5bd57-8542-4509-a620-c48d2f6c9e06","Type":"ContainerStarted","Data":"265c6555db0afbca43d9c5d5b66de28fcdf23a54722aa41fdea7f93d8308551f"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.688663 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"03eade59-3312-49be-a51a-9fdcd37f9a33","Type":"ContainerStarted","Data":"0280d9d06f0d4d6d9fdbfc8c0d859fe5b6853737046bad0ace9ae4552001d01b"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.689966 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7st6" event={"ID":"8575964a-bedb-456c-b992-116f66bb7fa2","Type":"ContainerStarted","Data":"04b28e799610a93e29d104c296d123119ec652e567d6d772c2b6402afec73279"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.691557 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fmnfb" event={"ID":"7e49f761-fcba-4ec3-9091-61f056e4eb58","Type":"ContainerStarted","Data":"ee8ca9eb3cde88eab43a1080544ecc74244046fce702ce378a262a9d81e7af7d"} Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.703367 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=5.444846682 podStartE2EDuration="13.703353725s" podCreationTimestamp="2026-01-31 09:17:18 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.638118783 +0000 UTC m=+753.306802251" lastFinishedPulling="2026-01-31 09:17:30.896625826 +0000 UTC m=+761.565309294" observedRunningTime="2026-01-31 09:17:31.69722229 +0000 UTC m=+762.365905758" watchObservedRunningTime="2026-01-31 09:17:31.703353725 +0000 UTC m=+762.372037193" Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.720310 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fmnfb" podStartSLOduration=3.9546386890000003 podStartE2EDuration="8.720296904s" podCreationTimestamp="2026-01-31 09:17:23 +0000 
UTC" firstStartedPulling="2026-01-31 09:17:26.177853979 +0000 UTC m=+756.846537447" lastFinishedPulling="2026-01-31 09:17:30.943512194 +0000 UTC m=+761.612195662" observedRunningTime="2026-01-31 09:17:31.709847863 +0000 UTC m=+762.378531331" watchObservedRunningTime="2026-01-31 09:17:31.720296904 +0000 UTC m=+762.388980372" Jan 31 09:17:31 crc kubenswrapper[4783]: I0131 09:17:31.794600 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=7.985943565 podStartE2EDuration="15.79457888s" podCreationTimestamp="2026-01-31 09:17:16 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.620224931 +0000 UTC m=+753.288908399" lastFinishedPulling="2026-01-31 09:17:30.428860246 +0000 UTC m=+761.097543714" observedRunningTime="2026-01-31 09:17:31.791583525 +0000 UTC m=+762.460266993" watchObservedRunningTime="2026-01-31 09:17:31.79457888 +0000 UTC m=+762.463262348" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.031953 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.032425 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="dnsmasq-dns" containerID="cri-o://62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7" gracePeriod=10 Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.035891 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.068717 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp"] Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.070302 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.072536 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.096693 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp"] Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.189076 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.189203 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fln62\" (UniqueName: \"kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.189256 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.189289 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " 
pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.228641 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp"] Jan 31 09:17:32 crc kubenswrapper[4783]: E0131 09:17:32.229534 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-fln62 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" podUID="f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.256528 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.258073 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.259695 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.270651 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.291065 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fln62\" (UniqueName: \"kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.291111 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " 
pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.291130 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.291212 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.292080 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.292249 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.292681 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.306750 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fln62\" (UniqueName: \"kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62\") pod \"dnsmasq-dns-7bbdc7ccd7-4vpvp\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.392958 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.393122 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.393273 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7scn\" (UniqueName: \"kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.393362 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: 
I0131 09:17:32.393425 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.494894 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7scn\" (UniqueName: \"kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495123 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495196 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495284 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495323 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495928 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.495965 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.496070 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.496142 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.509212 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7scn\" (UniqueName: 
\"kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn\") pod \"dnsmasq-dns-757dc6fff9-mtrfn\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.583059 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.627876 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.708544 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bed60581-fb96-4e66-bd14-2e2c0f75a771","Type":"ContainerStarted","Data":"60756dfdd968c5db9de6f096b2305fe42112e6dbec4d16d296c828f8969488dc"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.708745 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bed60581-fb96-4e66-bd14-2e2c0f75a771","Type":"ContainerStarted","Data":"2c73cc616fc0c8438ff42040a9889741cc76e72224846a5e3c631edc95825ee8"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.714667 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rll65" event={"ID":"ea827790-18ef-4c55-8b5f-365ead9b9f6c","Type":"ContainerStarted","Data":"cc19f80aa594c191d8c1e27fc7a3edc4cae8920277a328bece47649653c386a2"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.714767 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rll65" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.716030 4783 generic.go:334] "Generic (PLEG): container finished" podID="8575964a-bedb-456c-b992-116f66bb7fa2" containerID="04b28e799610a93e29d104c296d123119ec652e567d6d772c2b6402afec73279" exitCode=0 Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.716084 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7st6" event={"ID":"8575964a-bedb-456c-b992-116f66bb7fa2","Type":"ContainerDied","Data":"04b28e799610a93e29d104c296d123119ec652e567d6d772c2b6402afec73279"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.719811 4783 generic.go:334] "Generic (PLEG): container finished" podID="80811804-f71b-48dc-873f-8583a6b3e785" containerID="62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7" exitCode=0 Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.719872 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" event={"ID":"80811804-f71b-48dc-873f-8583a6b3e785","Type":"ContainerDied","Data":"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.719919 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" event={"ID":"80811804-f71b-48dc-873f-8583a6b3e785","Type":"ContainerDied","Data":"07f9ac3622a49f292d8093687055660872f566c20da13d39a4992eefb7d25178"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.719933 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-8mf27" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.719984 4783 scope.go:117] "RemoveContainer" containerID="62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.722375 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerStarted","Data":"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.735291 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.735285 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerStarted","Data":"e7e18d5ab9b16321ee2a0c8b2935712bc5ad499d23d1ae5dc9633f021df16c76"} Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.736488 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.197546064 podStartE2EDuration="9.73647731s" podCreationTimestamp="2026-01-31 09:17:23 +0000 UTC" firstStartedPulling="2026-01-31 09:17:30.386888889 +0000 UTC m=+761.055572358" lastFinishedPulling="2026-01-31 09:17:31.925820136 +0000 UTC m=+762.594503604" observedRunningTime="2026-01-31 09:17:32.730249864 +0000 UTC m=+763.398933332" watchObservedRunningTime="2026-01-31 09:17:32.73647731 +0000 UTC m=+763.405160778" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.742666 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.754782 4783 scope.go:117] "RemoveContainer" containerID="cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.780519 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rll65" podStartSLOduration=2.84756924 podStartE2EDuration="10.780500372s" podCreationTimestamp="2026-01-31 09:17:22 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.9635807 +0000 UTC m=+753.632264167" lastFinishedPulling="2026-01-31 09:17:30.896511831 +0000 UTC m=+761.565195299" observedRunningTime="2026-01-31 09:17:32.779976324 +0000 UTC m=+763.448659792" watchObservedRunningTime="2026-01-31 09:17:32.780500372 +0000 UTC m=+763.449183840" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.786690 4783 scope.go:117] "RemoveContainer" containerID="62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7" Jan 31 09:17:32 crc kubenswrapper[4783]: E0131 09:17:32.787142 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7\": container with ID starting with 62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7 not found: ID does not exist" containerID="62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.787228 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7"} err="failed to get container status \"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7\": rpc error: code = NotFound desc = could not find container \"62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7\": container with ID starting 
with 62049870ea3fb228e13870ed6650c57075fb213366821a7c42a85e34212c81a7 not found: ID does not exist" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.787252 4783 scope.go:117] "RemoveContainer" containerID="cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570" Jan 31 09:17:32 crc kubenswrapper[4783]: E0131 09:17:32.787483 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570\": container with ID starting with cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570 not found: ID does not exist" containerID="cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.787499 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570"} err="failed to get container status \"cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570\": rpc error: code = NotFound desc = could not find container \"cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570\": container with ID starting with cbbbca7c2dbb876bf9a546a38face610ef10022be9cbddab4da87de1af8ed570 not found: ID does not exist" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.800990 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config\") pod \"80811804-f71b-48dc-873f-8583a6b3e785\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.801117 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc\") pod \"80811804-f71b-48dc-873f-8583a6b3e785\" (UID: 
\"80811804-f71b-48dc-873f-8583a6b3e785\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.801241 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6v5c\" (UniqueName: \"kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c\") pod \"80811804-f71b-48dc-873f-8583a6b3e785\" (UID: \"80811804-f71b-48dc-873f-8583a6b3e785\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.808868 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c" (OuterVolumeSpecName: "kube-api-access-f6v5c") pod "80811804-f71b-48dc-873f-8583a6b3e785" (UID: "80811804-f71b-48dc-873f-8583a6b3e785"). InnerVolumeSpecName "kube-api-access-f6v5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.834507 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "80811804-f71b-48dc-873f-8583a6b3e785" (UID: "80811804-f71b-48dc-873f-8583a6b3e785"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.842585 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config" (OuterVolumeSpecName: "config") pod "80811804-f71b-48dc-873f-8583a6b3e785" (UID: "80811804-f71b-48dc-873f-8583a6b3e785"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.902701 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config\") pod \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.902827 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc\") pod \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.902888 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb\") pod \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.902915 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fln62\" (UniqueName: \"kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62\") pod \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\" (UID: \"f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7\") " Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.903817 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config" (OuterVolumeSpecName: "config") pod "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" (UID: "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.904133 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" (UID: "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.904474 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" (UID: "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.906915 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.906944 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6v5c\" (UniqueName: \"kubernetes.io/projected/80811804-f71b-48dc-873f-8583a6b3e785-kube-api-access-f6v5c\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.906958 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.906968 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80811804-f71b-48dc-873f-8583a6b3e785-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:32 crc 
kubenswrapper[4783]: I0131 09:17:32.907003 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:32 crc kubenswrapper[4783]: I0131 09:17:32.909953 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62" (OuterVolumeSpecName: "kube-api-access-fln62") pod "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" (UID: "f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7"). InnerVolumeSpecName "kube-api-access-fln62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.008881 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.009238 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fln62\" (UniqueName: \"kubernetes.io/projected/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7-kube-api-access-fln62\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.035864 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.048896 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.052480 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-8mf27"] Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.422476 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.654760 4783 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="80811804-f71b-48dc-873f-8583a6b3e785" path="/var/lib/kubelet/pods/80811804-f71b-48dc-873f-8583a6b3e785/volumes" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.748001 4783 generic.go:334] "Generic (PLEG): container finished" podID="035c0867-e094-48c8-a511-d910357df9ff" containerID="8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7" exitCode=0 Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.748039 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" event={"ID":"035c0867-e094-48c8-a511-d910357df9ff","Type":"ContainerDied","Data":"8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.748083 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" event={"ID":"035c0867-e094-48c8-a511-d910357df9ff","Type":"ContainerStarted","Data":"dacf417e411aafbdae288d5e0f495a96f10cbed5aacbd056c88de47047b662b2"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.751065 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7st6" event={"ID":"8575964a-bedb-456c-b992-116f66bb7fa2","Type":"ContainerStarted","Data":"ef5cb45794444286abbc9886a8d4fb4fa04d176490111f74259fdb54b29b62fe"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.751126 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7st6" event={"ID":"8575964a-bedb-456c-b992-116f66bb7fa2","Type":"ContainerStarted","Data":"2da6f5024b8060eab6d440fa5c662a05ec1ac6d1cc36e91a01e76f3b43833673"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.751391 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.751490 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.757441 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e90cb35-3366-44df-9238-3da82d300654","Type":"ContainerStarted","Data":"b6cee645b2d7d2e6f3abab13d50dfc7a07541f060e0ebc4b62c8564f0a66af5b"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.757476 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8e90cb35-3366-44df-9238-3da82d300654","Type":"ContainerStarted","Data":"a0460bfbc1e48010ef8f850228ab8c517e535040706b087154fd69c1a41fabf5"} Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.758967 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.790789 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k7st6" podStartSLOduration=4.065531826 podStartE2EDuration="11.790773314s" podCreationTimestamp="2026-01-31 09:17:22 +0000 UTC" firstStartedPulling="2026-01-31 09:17:23.171464328 +0000 UTC m=+753.840147797" lastFinishedPulling="2026-01-31 09:17:30.896705817 +0000 UTC m=+761.565389285" observedRunningTime="2026-01-31 09:17:33.788320101 +0000 UTC m=+764.457003569" watchObservedRunningTime="2026-01-31 09:17:33.790773314 +0000 UTC m=+764.459456782" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.807842 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.454619574 podStartE2EDuration="8.807826471s" podCreationTimestamp="2026-01-31 09:17:25 +0000 UTC" firstStartedPulling="2026-01-31 09:17:31.207673869 +0000 UTC m=+761.876357337" lastFinishedPulling="2026-01-31 09:17:32.560880765 +0000 UTC m=+763.229564234" observedRunningTime="2026-01-31 09:17:33.80119277 +0000 UTC m=+764.469876238" 
watchObservedRunningTime="2026-01-31 09:17:33.807826471 +0000 UTC m=+764.476509939" Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.830439 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp"] Jan 31 09:17:33 crc kubenswrapper[4783]: I0131 09:17:33.834090 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bbdc7ccd7-4vpvp"] Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.422962 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.764885 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" event={"ID":"035c0867-e094-48c8-a511-d910357df9ff","Type":"ContainerStarted","Data":"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84"} Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.765553 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.766599 4783 generic.go:334] "Generic (PLEG): container finished" podID="c9a5bd57-8542-4509-a620-c48d2f6c9e06" containerID="265c6555db0afbca43d9c5d5b66de28fcdf23a54722aa41fdea7f93d8308551f" exitCode=0 Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.766648 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c9a5bd57-8542-4509-a620-c48d2f6c9e06","Type":"ContainerDied","Data":"265c6555db0afbca43d9c5d5b66de28fcdf23a54722aa41fdea7f93d8308551f"} Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.768087 4783 generic.go:334] "Generic (PLEG): container finished" podID="03eade59-3312-49be-a51a-9fdcd37f9a33" containerID="0280d9d06f0d4d6d9fdbfc8c0d859fe5b6853737046bad0ace9ae4552001d01b" exitCode=0 Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.768240 4783 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/openstack-galera-0" event={"ID":"03eade59-3312-49be-a51a-9fdcd37f9a33","Type":"ContainerDied","Data":"0280d9d06f0d4d6d9fdbfc8c0d859fe5b6853737046bad0ace9ae4552001d01b"} Jan 31 09:17:34 crc kubenswrapper[4783]: I0131 09:17:34.785113 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" podStartSLOduration=2.78510212 podStartE2EDuration="2.78510212s" podCreationTimestamp="2026-01-31 09:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:34.781429609 +0000 UTC m=+765.450113077" watchObservedRunningTime="2026-01-31 09:17:34.78510212 +0000 UTC m=+765.453785588" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.594570 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.627933 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.654245 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7" path="/var/lib/kubelet/pods/f6bcc6ff-1c3c-4bd0-8b31-8d6c631252b7/volumes" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.778433 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c9a5bd57-8542-4509-a620-c48d2f6c9e06","Type":"ContainerStarted","Data":"fd4720d821e87ffe9d7c34686fc3dc154b8070dbadef2b2cdb4ab3000b3a5d23"} Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.780297 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"03eade59-3312-49be-a51a-9fdcd37f9a33","Type":"ContainerStarted","Data":"db34132f75b7276b01f7f41b933dedd30f90fa1e7478e337bc366dc80c1bdeca"} Jan 31 09:17:35 crc 
kubenswrapper[4783]: I0131 09:17:35.780950 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.802467 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=12.695028714 podStartE2EDuration="20.802441339s" podCreationTimestamp="2026-01-31 09:17:15 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.789207921 +0000 UTC m=+753.457891389" lastFinishedPulling="2026-01-31 09:17:30.896620546 +0000 UTC m=+761.565304014" observedRunningTime="2026-01-31 09:17:35.794762296 +0000 UTC m=+766.463445765" watchObservedRunningTime="2026-01-31 09:17:35.802441339 +0000 UTC m=+766.471124806" Jan 31 09:17:35 crc kubenswrapper[4783]: I0131 09:17:35.819859 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.589471379 podStartE2EDuration="22.819832622s" podCreationTimestamp="2026-01-31 09:17:13 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.667561928 +0000 UTC m=+753.336245396" lastFinishedPulling="2026-01-31 09:17:30.897923171 +0000 UTC m=+761.566606639" observedRunningTime="2026-01-31 09:17:35.814660505 +0000 UTC m=+766.483343974" watchObservedRunningTime="2026-01-31 09:17:35.819832622 +0000 UTC m=+766.488516090" Jan 31 09:17:36 crc kubenswrapper[4783]: I0131 09:17:36.449760 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:36 crc kubenswrapper[4783]: I0131 09:17:36.631753 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:36 crc kubenswrapper[4783]: I0131 09:17:36.631833 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:37 crc kubenswrapper[4783]: I0131 09:17:37.094463 4783 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.877005 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.932900 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.933470 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="dnsmasq-dns" containerID="cri-o://a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84" gracePeriod=10 Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.959109 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:17:38 crc kubenswrapper[4783]: E0131 09:17:38.959565 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="init" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.959578 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="init" Jan 31 09:17:38 crc kubenswrapper[4783]: E0131 09:17:38.959600 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="dnsmasq-dns" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.959606 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="dnsmasq-dns" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.959743 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="80811804-f71b-48dc-873f-8583a6b3e785" containerName="dnsmasq-dns" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.960596 4783 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:38 crc kubenswrapper[4783]: I0131 09:17:38.975036 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.014054 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.014116 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.014151 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c682h\" (UniqueName: \"kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.014189 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.014211 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.115650 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.115720 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.115879 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.115960 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.116020 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c682h\" 
(UniqueName: \"kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.116502 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.117063 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.117458 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.118012 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.134792 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c682h\" (UniqueName: \"kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h\") pod 
\"dnsmasq-dns-6cb545bd4c-99hbj\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.298689 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.353547 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.421119 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc\") pod \"035c0867-e094-48c8-a511-d910357df9ff\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.421253 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb\") pod \"035c0867-e094-48c8-a511-d910357df9ff\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.421433 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb\") pod \"035c0867-e094-48c8-a511-d910357df9ff\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.421495 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7scn\" (UniqueName: \"kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn\") pod \"035c0867-e094-48c8-a511-d910357df9ff\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.421526 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config\") pod \"035c0867-e094-48c8-a511-d910357df9ff\" (UID: \"035c0867-e094-48c8-a511-d910357df9ff\") " Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.431976 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn" (OuterVolumeSpecName: "kube-api-access-x7scn") pod "035c0867-e094-48c8-a511-d910357df9ff" (UID: "035c0867-e094-48c8-a511-d910357df9ff"). InnerVolumeSpecName "kube-api-access-x7scn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.457853 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "035c0867-e094-48c8-a511-d910357df9ff" (UID: "035c0867-e094-48c8-a511-d910357df9ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.461392 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.470575 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "035c0867-e094-48c8-a511-d910357df9ff" (UID: "035c0867-e094-48c8-a511-d910357df9ff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.471770 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config" (OuterVolumeSpecName: "config") pod "035c0867-e094-48c8-a511-d910357df9ff" (UID: "035c0867-e094-48c8-a511-d910357df9ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.481223 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "035c0867-e094-48c8-a511-d910357df9ff" (UID: "035c0867-e094-48c8-a511-d910357df9ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.524089 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.524292 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7scn\" (UniqueName: \"kubernetes.io/projected/035c0867-e094-48c8-a511-d910357df9ff-kube-api-access-x7scn\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.524463 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.524496 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:39 crc 
kubenswrapper[4783]: I0131 09:17:39.524506 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035c0867-e094-48c8-a511-d910357df9ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.701653 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:17:39 crc kubenswrapper[4783]: W0131 09:17:39.703950 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3de91956_c9f6_4dda_ab39_1bb28e7b16de.slice/crio-67e59e16b242b96668036e80e8f6aa10d6665c1c6773e628e7037777b006ffae WatchSource:0}: Error finding container 67e59e16b242b96668036e80e8f6aa10d6665c1c6773e628e7037777b006ffae: Status 404 returned error can't find the container with id 67e59e16b242b96668036e80e8f6aa10d6665c1c6773e628e7037777b006ffae Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.806883 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" event={"ID":"3de91956-c9f6-4dda-ab39-1bb28e7b16de","Type":"ContainerStarted","Data":"67e59e16b242b96668036e80e8f6aa10d6665c1c6773e628e7037777b006ffae"} Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.808743 4783 generic.go:334] "Generic (PLEG): container finished" podID="035c0867-e094-48c8-a511-d910357df9ff" containerID="a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84" exitCode=0 Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.808798 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" event={"ID":"035c0867-e094-48c8-a511-d910357df9ff","Type":"ContainerDied","Data":"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84"} Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.808836 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.808858 4783 scope.go:117] "RemoveContainer" containerID="a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.808844 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-mtrfn" event={"ID":"035c0867-e094-48c8-a511-d910357df9ff","Type":"ContainerDied","Data":"dacf417e411aafbdae288d5e0f495a96f10cbed5aacbd056c88de47047b662b2"} Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.826321 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.828310 4783 scope.go:117] "RemoveContainer" containerID="8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.832182 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-mtrfn"] Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.843394 4783 scope.go:117] "RemoveContainer" containerID="a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84" Jan 31 09:17:39 crc kubenswrapper[4783]: E0131 09:17:39.843886 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84\": container with ID starting with a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84 not found: ID does not exist" containerID="a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.843923 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84"} err="failed to get container status 
\"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84\": rpc error: code = NotFound desc = could not find container \"a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84\": container with ID starting with a39ec6180b8fbfcdecefa0d9cd49637130d6abd1869820dbca97f777fa9b0f84 not found: ID does not exist" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.843945 4783 scope.go:117] "RemoveContainer" containerID="8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7" Jan 31 09:17:39 crc kubenswrapper[4783]: E0131 09:17:39.844300 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7\": container with ID starting with 8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7 not found: ID does not exist" containerID="8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7" Jan 31 09:17:39 crc kubenswrapper[4783]: I0131 09:17:39.844325 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7"} err="failed to get container status \"8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7\": rpc error: code = NotFound desc = could not find container \"8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7\": container with ID starting with 8535c67753fbdea70cb90f11e6278ccae7c11b06109179accedc05b468ffbba7 not found: ID does not exist" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.053851 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.054141 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="dnsmasq-dns" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.054152 4783 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="dnsmasq-dns" Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.054185 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="init" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.054190 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="init" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.054324 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="035c0867-e094-48c8-a511-d910357df9ff" containerName="dnsmasq-dns" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.060266 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.061887 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.061892 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.061999 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.062470 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-58d9k" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.076244 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132049 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-cache\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " 
pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132118 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhdt\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-kube-api-access-8nhdt\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132146 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78f0039-d432-4056-a572-d3049488bb75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132199 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132217 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-lock\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.132412 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.233907 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.233976 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-cache\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234037 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhdt\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-kube-api-access-8nhdt\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234067 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78f0039-d432-4056-a572-d3049488bb75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234109 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234131 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-lock\") pod \"swift-storage-0\" (UID: 
\"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.234156 4783 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.234204 4783 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.234271 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift podName:c78f0039-d432-4056-a572-d3049488bb75 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:40.734247404 +0000 UTC m=+771.402930872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift") pod "swift-storage-0" (UID: "c78f0039-d432-4056-a572-d3049488bb75") : configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234458 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.234951 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-cache\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.235055 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/c78f0039-d432-4056-a572-d3049488bb75-lock\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.240011 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78f0039-d432-4056-a572-d3049488bb75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.247992 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhdt\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-kube-api-access-8nhdt\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.252143 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.548215 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-58vfp"] Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.549100 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.550563 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.550695 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.550973 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.565100 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-58vfp"] Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.644813 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.644907 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.644945 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 
09:17:40.644983 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfk6p\" (UniqueName: \"kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.645011 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.645085 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.645115 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.697509 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746108 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfk6p\" (UniqueName: 
\"kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746183 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746226 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746282 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746369 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746419 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" 
(UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746454 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.746478 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.746826 4783 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.746866 4783 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: E0131 09:17:40.746923 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift podName:c78f0039-d432-4056-a572-d3049488bb75 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:41.74690284 +0000 UTC m=+772.415586307 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift") pod "swift-storage-0" (UID: "c78f0039-d432-4056-a572-d3049488bb75") : configmap "swift-ring-files" not found Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.747356 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.747976 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.748388 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.751100 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.751350 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf\") pod 
\"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.751452 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.760358 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.762069 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfk6p\" (UniqueName: \"kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p\") pod \"swift-ring-rebalance-58vfp\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.815822 4783 generic.go:334] "Generic (PLEG): container finished" podID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerID="d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2" exitCode=0 Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.815899 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" event={"ID":"3de91956-c9f6-4dda-ab39-1bb28e7b16de","Type":"ContainerDied","Data":"d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2"} Jan 31 09:17:40 crc kubenswrapper[4783]: I0131 09:17:40.872234 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.243539 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-58vfp"] Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.252036 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.627254 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.654230 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035c0867-e094-48c8-a511-d910357df9ff" path="/var/lib/kubelet/pods/035c0867-e094-48c8-a511-d910357df9ff/volumes" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.762177 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:41 crc kubenswrapper[4783]: E0131 09:17:41.762373 4783 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:41 crc kubenswrapper[4783]: E0131 09:17:41.762402 4783 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:41 crc kubenswrapper[4783]: E0131 09:17:41.762468 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift podName:c78f0039-d432-4056-a572-d3049488bb75 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:43.762448879 +0000 UTC m=+774.431132347 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift") pod "swift-storage-0" (UID: "c78f0039-d432-4056-a572-d3049488bb75") : configmap "swift-ring-files" not found Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.765609 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.766805 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.777100 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.777103 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vl5wj" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.778414 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.778781 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.778904 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.824130 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58vfp" event={"ID":"6827ccb1-8fcf-4451-a878-25d3d5765ae6","Type":"ContainerStarted","Data":"d3d69d44673ede734038ea42db5c20bacf6e7348dc98495b73b754bfd648ddfc"} Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.825846 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" 
event={"ID":"3de91956-c9f6-4dda-ab39-1bb28e7b16de","Type":"ContainerStarted","Data":"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede"} Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.826005 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.841212 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" podStartSLOduration=3.841195689 podStartE2EDuration="3.841195689s" podCreationTimestamp="2026-01-31 09:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:41.839282413 +0000 UTC m=+772.507965882" watchObservedRunningTime="2026-01-31 09:17:41.841195689 +0000 UTC m=+772.509879156" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.863777 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.863832 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xtr\" (UniqueName: \"kubernetes.io/projected/273da0f4-592f-4736-8435-28cd6f46ed55-kube-api-access-p2xtr\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.863910 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.864001 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.864184 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-config\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.864225 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.864271 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-scripts\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.965217 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.965677 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.965764 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.965933 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-config\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.966038 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.966131 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-scripts\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.966501 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.966613 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xtr\" (UniqueName: \"kubernetes.io/projected/273da0f4-592f-4736-8435-28cd6f46ed55-kube-api-access-p2xtr\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.966798 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-config\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.967389 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/273da0f4-592f-4736-8435-28cd6f46ed55-scripts\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.973014 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.973246 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.973838 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/273da0f4-592f-4736-8435-28cd6f46ed55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:41 crc kubenswrapper[4783]: I0131 09:17:41.980979 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xtr\" (UniqueName: \"kubernetes.io/projected/273da0f4-592f-4736-8435-28cd6f46ed55-kube-api-access-p2xtr\") pod \"ovn-northd-0\" (UID: \"273da0f4-592f-4736-8435-28cd6f46ed55\") " pod="openstack/ovn-northd-0" Jan 31 09:17:42 crc kubenswrapper[4783]: I0131 09:17:42.080811 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 31 09:17:42 crc kubenswrapper[4783]: I0131 09:17:42.499419 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:17:42 crc kubenswrapper[4783]: W0131 09:17:42.507865 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod273da0f4_592f_4736_8435_28cd6f46ed55.slice/crio-e23edf36e622fc594481478977bf2e2ead5e027345652c5246049d671c58d1f4 WatchSource:0}: Error finding container e23edf36e622fc594481478977bf2e2ead5e027345652c5246049d671c58d1f4: Status 404 returned error can't find the container with id e23edf36e622fc594481478977bf2e2ead5e027345652c5246049d671c58d1f4 Jan 31 09:17:42 crc kubenswrapper[4783]: I0131 09:17:42.834923 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"273da0f4-592f-4736-8435-28cd6f46ed55","Type":"ContainerStarted","Data":"e23edf36e622fc594481478977bf2e2ead5e027345652c5246049d671c58d1f4"} Jan 31 09:17:43 crc kubenswrapper[4783]: I0131 09:17:43.798972 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: 
\"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:43 crc kubenswrapper[4783]: E0131 09:17:43.799231 4783 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:43 crc kubenswrapper[4783]: E0131 09:17:43.799264 4783 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:43 crc kubenswrapper[4783]: E0131 09:17:43.799336 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift podName:c78f0039-d432-4056-a572-d3049488bb75 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:47.799309039 +0000 UTC m=+778.467992507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift") pod "swift-storage-0" (UID: "c78f0039-d432-4056-a572-d3049488bb75") : configmap "swift-ring-files" not found Jan 31 09:17:43 crc kubenswrapper[4783]: I0131 09:17:43.848193 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"273da0f4-592f-4736-8435-28cd6f46ed55","Type":"ContainerStarted","Data":"4853e8a0ad7bd28145edd0bce3f02d4049f67cdf1e3a455f2d5b19832371f315"} Jan 31 09:17:43 crc kubenswrapper[4783]: I0131 09:17:43.848244 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"273da0f4-592f-4736-8435-28cd6f46ed55","Type":"ContainerStarted","Data":"7f86db47490f270efb1625ab9a62be915155230d128c8bfb62205c2b41a64da7"} Jan 31 09:17:43 crc kubenswrapper[4783]: I0131 09:17:43.848352 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 31 09:17:43 crc kubenswrapper[4783]: I0131 09:17:43.873014 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-northd-0" podStartSLOduration=1.893294532 podStartE2EDuration="2.872963291s" podCreationTimestamp="2026-01-31 09:17:41 +0000 UTC" firstStartedPulling="2026-01-31 09:17:42.510134473 +0000 UTC m=+773.178817940" lastFinishedPulling="2026-01-31 09:17:43.489803231 +0000 UTC m=+774.158486699" observedRunningTime="2026-01-31 09:17:43.865745489 +0000 UTC m=+774.534428957" watchObservedRunningTime="2026-01-31 09:17:43.872963291 +0000 UTC m=+774.541646759" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.287617 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.287959 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.359760 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.462528 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bmgzl"] Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.464146 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.466038 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.471764 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmgzl"] Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.632775 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts\") pod \"root-account-create-update-bmgzl\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.632821 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrql\" (UniqueName: \"kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql\") pod \"root-account-create-update-bmgzl\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.733867 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts\") pod \"root-account-create-update-bmgzl\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.733931 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrql\" (UniqueName: \"kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql\") pod \"root-account-create-update-bmgzl\" (UID: 
\"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.734717 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts\") pod \"root-account-create-update-bmgzl\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.750736 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrql\" (UniqueName: \"kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql\") pod \"root-account-create-update-bmgzl\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.781462 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.874131 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58vfp" event={"ID":"6827ccb1-8fcf-4451-a878-25d3d5765ae6","Type":"ContainerStarted","Data":"85538127a05d852d4d1cb118623cdc93eb3c6bfaf54e6ebd4130e0e33b36cb19"} Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.896445 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-58vfp" podStartSLOduration=1.8318416210000001 podStartE2EDuration="5.896430111s" podCreationTimestamp="2026-01-31 09:17:40 +0000 UTC" firstStartedPulling="2026-01-31 09:17:41.251842936 +0000 UTC m=+771.920526404" lastFinishedPulling="2026-01-31 09:17:45.316431426 +0000 UTC m=+775.985114894" observedRunningTime="2026-01-31 09:17:45.894230035 +0000 UTC m=+776.562913503" watchObservedRunningTime="2026-01-31 09:17:45.896430111 +0000 UTC m=+776.565113578" Jan 31 09:17:45 crc kubenswrapper[4783]: I0131 09:17:45.941482 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.194662 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bmgzl"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.667667 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jf4sp"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.668841 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.681690 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jf4sp"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.785520 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2ed3-account-create-update-xmcll"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.786750 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.788776 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.792751 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ed3-account-create-update-xmcll"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.855933 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g9gr\" (UniqueName: \"kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.856023 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.892090 4783 generic.go:334] "Generic (PLEG): container finished" podID="ffccfe82-01de-494d-ba4a-a59e4c242a7c" containerID="9c8955ce5a44a6f8dab9197031364dc10ab88c7ade6d4c9985fb2908b0375e1c" 
exitCode=0 Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.892176 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmgzl" event={"ID":"ffccfe82-01de-494d-ba4a-a59e4c242a7c","Type":"ContainerDied","Data":"9c8955ce5a44a6f8dab9197031364dc10ab88c7ade6d4c9985fb2908b0375e1c"} Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.892782 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmgzl" event={"ID":"ffccfe82-01de-494d-ba4a-a59e4c242a7c","Type":"ContainerStarted","Data":"765bb9397e2f2e2a85dadc642ba5fea03ea827c3528d8835ed3c83484df26e78"} Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.957808 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqzn\" (UniqueName: \"kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.957892 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g9gr\" (UniqueName: \"kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.957986 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.958007 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.958789 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.989368 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g9gr\" (UniqueName: \"kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr\") pod \"keystone-db-create-jf4sp\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.995433 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9cg97"] Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.996756 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9cg97" Jan 31 09:17:46 crc kubenswrapper[4783]: I0131 09:17:46.999612 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9cg97"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.059040 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.059676 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.059678 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqzn\" (UniqueName: \"kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.074760 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b696-account-create-update-4xlwd"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.075202 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqzn\" (UniqueName: \"kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn\") pod \"keystone-2ed3-account-create-update-xmcll\" (UID: 
\"046f6bf6-2c59-43f9-8964-5949209241b5\") " pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.075869 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.079737 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.080289 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b696-account-create-update-4xlwd"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.099364 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.162138 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.163209 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvkx4\" (UniqueName: \"kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.265233 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts\") pod \"placement-b696-account-create-update-4xlwd\" (UID: 
\"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.265275 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6th\" (UniqueName: \"kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th\") pod \"placement-b696-account-create-update-4xlwd\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.265667 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvkx4\" (UniqueName: \"kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.265713 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.266460 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.280753 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-czqdq"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.281550 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.281768 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.286439 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-czqdq"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.289620 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvkx4\" (UniqueName: \"kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4\") pod \"placement-db-create-9cg97\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.322047 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9cg97" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.367712 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts\") pod \"placement-b696-account-create-update-4xlwd\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.367939 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6th\" (UniqueName: \"kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th\") pod \"placement-b696-account-create-update-4xlwd\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.369478 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts\") pod \"placement-b696-account-create-update-4xlwd\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.386362 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6th\" (UniqueName: \"kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th\") pod \"placement-b696-account-create-update-4xlwd\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.388191 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-18c3-account-create-update-bb8tv"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.389089 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.391021 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.398612 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18c3-account-create-update-bb8tv"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.418325 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.469047 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4kh\" (UniqueName: \"kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.469095 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.484156 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ed3-account-create-update-xmcll"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.574928 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkpb\" (UniqueName: \"kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.574994 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4kh\" (UniqueName: \"kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.575031 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.575124 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.576227 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.591267 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4kh\" (UniqueName: \"kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh\") pod \"glance-db-create-czqdq\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.623870 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-czqdq" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.678941 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.679513 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkpb\" (UniqueName: \"kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.679665 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.698972 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkpb\" (UniqueName: \"kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb\") pod \"glance-18c3-account-create-update-bb8tv\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.703364 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jf4sp"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.704952 4783 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.764802 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9cg97"] Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.846671 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b696-account-create-update-4xlwd"] Jan 31 09:17:47 crc kubenswrapper[4783]: W0131 09:17:47.871522 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bdad584_8c0f_433e_b36d_1a8584cecc18.slice/crio-021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c WatchSource:0}: Error finding container 021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c: Status 404 returned error can't find the container with id 021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.883088 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:47 crc kubenswrapper[4783]: E0131 09:17:47.883322 4783 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:47 crc kubenswrapper[4783]: E0131 09:17:47.883363 4783 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:47 crc kubenswrapper[4783]: E0131 09:17:47.883420 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift podName:c78f0039-d432-4056-a572-d3049488bb75 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:17:55.883399109 +0000 UTC m=+786.552082577 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift") pod "swift-storage-0" (UID: "c78f0039-d432-4056-a572-d3049488bb75") : configmap "swift-ring-files" not found Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.900682 4783 generic.go:334] "Generic (PLEG): container finished" podID="046f6bf6-2c59-43f9-8964-5949209241b5" containerID="5ba8e19f22990ec5e7786723159e19551167ec369babf57c45970338eda81d3e" exitCode=0 Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.900750 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ed3-account-create-update-xmcll" event={"ID":"046f6bf6-2c59-43f9-8964-5949209241b5","Type":"ContainerDied","Data":"5ba8e19f22990ec5e7786723159e19551167ec369babf57c45970338eda81d3e"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.900959 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ed3-account-create-update-xmcll" event={"ID":"046f6bf6-2c59-43f9-8964-5949209241b5","Type":"ContainerStarted","Data":"6bbbfb189f199358bec1993c842211f71b5b1c842e8e36b1f97ed57e9740a6ac"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.915312 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b696-account-create-update-4xlwd" event={"ID":"6bdad584-8c0f-433e-b36d-1a8584cecc18","Type":"ContainerStarted","Data":"021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.917195 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9cg97" event={"ID":"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e","Type":"ContainerStarted","Data":"55f727a603483fa958c83d8909b4758b49f4add156590a20f43b77f17b050b65"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.920512 4783 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jf4sp" event={"ID":"ec5fc027-5dcc-47d7-972a-ddf14c314725","Type":"ContainerStarted","Data":"ba2e5aef78807d8ab62365575c0206ab61f11d3c8bb4987394232545fa2e7001"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.920549 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jf4sp" event={"ID":"ec5fc027-5dcc-47d7-972a-ddf14c314725","Type":"ContainerStarted","Data":"1ad0b4131f61d557e8efe9888d253d366a3cea452d642eeb70009ca58cf1de0d"} Jan 31 09:17:47 crc kubenswrapper[4783]: I0131 09:17:47.939870 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-jf4sp" podStartSLOduration=1.939857696 podStartE2EDuration="1.939857696s" podCreationTimestamp="2026-01-31 09:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:47.9395519 +0000 UTC m=+778.608235368" watchObservedRunningTime="2026-01-31 09:17:47.939857696 +0000 UTC m=+778.608541164" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.017506 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-czqdq"] Jan 31 09:17:48 crc kubenswrapper[4783]: W0131 09:17:48.125240 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f714661_d55e_4c8f_b2c2_8420206b1a72.slice/crio-730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b WatchSource:0}: Error finding container 730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b: Status 404 returned error can't find the container with id 730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.128150 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18c3-account-create-update-bb8tv"] Jan 31 09:17:48 crc 
kubenswrapper[4783]: I0131 09:17:48.320523 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.498032 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts\") pod \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.498277 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xrql\" (UniqueName: \"kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql\") pod \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\" (UID: \"ffccfe82-01de-494d-ba4a-a59e4c242a7c\") " Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.498663 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffccfe82-01de-494d-ba4a-a59e4c242a7c" (UID: "ffccfe82-01de-494d-ba4a-a59e4c242a7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.498954 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffccfe82-01de-494d-ba4a-a59e4c242a7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.503397 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql" (OuterVolumeSpecName: "kube-api-access-5xrql") pod "ffccfe82-01de-494d-ba4a-a59e4c242a7c" (UID: "ffccfe82-01de-494d-ba4a-a59e4c242a7c"). 
InnerVolumeSpecName "kube-api-access-5xrql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.601080 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xrql\" (UniqueName: \"kubernetes.io/projected/ffccfe82-01de-494d-ba4a-a59e4c242a7c-kube-api-access-5xrql\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.932003 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b696-account-create-update-4xlwd" event={"ID":"6bdad584-8c0f-433e-b36d-1a8584cecc18","Type":"ContainerDied","Data":"0c9a8cb448fa4629e4e00d219bf81c36a6956474c9eea01dc7f29310e8cda6fd"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.931810 4783 generic.go:334] "Generic (PLEG): container finished" podID="6bdad584-8c0f-433e-b36d-1a8584cecc18" containerID="0c9a8cb448fa4629e4e00d219bf81c36a6956474c9eea01dc7f29310e8cda6fd" exitCode=0 Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.935457 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bmgzl" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.935487 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bmgzl" event={"ID":"ffccfe82-01de-494d-ba4a-a59e4c242a7c","Type":"ContainerDied","Data":"765bb9397e2f2e2a85dadc642ba5fea03ea827c3528d8835ed3c83484df26e78"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.935589 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765bb9397e2f2e2a85dadc642ba5fea03ea827c3528d8835ed3c83484df26e78" Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.938188 4783 generic.go:334] "Generic (PLEG): container finished" podID="07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" containerID="3a68dc5e2c08d6cd779d6d73430d6226a5f55ea0545148e20ccb80c4b4020d6a" exitCode=0 Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.938273 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9cg97" event={"ID":"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e","Type":"ContainerDied","Data":"3a68dc5e2c08d6cd779d6d73430d6226a5f55ea0545148e20ccb80c4b4020d6a"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.943694 4783 generic.go:334] "Generic (PLEG): container finished" podID="ec5fc027-5dcc-47d7-972a-ddf14c314725" containerID="ba2e5aef78807d8ab62365575c0206ab61f11d3c8bb4987394232545fa2e7001" exitCode=0 Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.943758 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jf4sp" event={"ID":"ec5fc027-5dcc-47d7-972a-ddf14c314725","Type":"ContainerDied","Data":"ba2e5aef78807d8ab62365575c0206ab61f11d3c8bb4987394232545fa2e7001"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.945651 4783 generic.go:334] "Generic (PLEG): container finished" podID="3f714661-d55e-4c8f-b2c2-8420206b1a72" containerID="0a930ede93072e683441e4fb215175a2a0361872579f82d076792fa9c90a3bcb" exitCode=0 Jan 31 
09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.945767 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18c3-account-create-update-bb8tv" event={"ID":"3f714661-d55e-4c8f-b2c2-8420206b1a72","Type":"ContainerDied","Data":"0a930ede93072e683441e4fb215175a2a0361872579f82d076792fa9c90a3bcb"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.945797 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18c3-account-create-update-bb8tv" event={"ID":"3f714661-d55e-4c8f-b2c2-8420206b1a72","Type":"ContainerStarted","Data":"730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.952286 4783 generic.go:334] "Generic (PLEG): container finished" podID="fd3660ec-5394-45d9-ae35-5c23e4749178" containerID="4113d6f5019e94f91242acbd37473b5b69b8d49e9cf65b4cec6f0a499f7266ae" exitCode=0 Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.952847 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-czqdq" event={"ID":"fd3660ec-5394-45d9-ae35-5c23e4749178","Type":"ContainerDied","Data":"4113d6f5019e94f91242acbd37473b5b69b8d49e9cf65b4cec6f0a499f7266ae"} Jan 31 09:17:48 crc kubenswrapper[4783]: I0131 09:17:48.952934 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-czqdq" event={"ID":"fd3660ec-5394-45d9-ae35-5c23e4749178","Type":"ContainerStarted","Data":"1d6aa9e993da41a85b870793a751b94812816ab2d015eddbf3307bd3b2c1d83b"} Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.249238 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.301256 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.342983 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.343277 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="dnsmasq-dns" containerID="cri-o://b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377" gracePeriod=10 Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.418063 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts\") pod \"046f6bf6-2c59-43f9-8964-5949209241b5\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.418156 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdqzn\" (UniqueName: \"kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn\") pod \"046f6bf6-2c59-43f9-8964-5949209241b5\" (UID: \"046f6bf6-2c59-43f9-8964-5949209241b5\") " Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.418772 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "046f6bf6-2c59-43f9-8964-5949209241b5" (UID: "046f6bf6-2c59-43f9-8964-5949209241b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.443690 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn" (OuterVolumeSpecName: "kube-api-access-hdqzn") pod "046f6bf6-2c59-43f9-8964-5949209241b5" (UID: "046f6bf6-2c59-43f9-8964-5949209241b5"). InnerVolumeSpecName "kube-api-access-hdqzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.520997 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/046f6bf6-2c59-43f9-8964-5949209241b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.521032 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdqzn\" (UniqueName: \"kubernetes.io/projected/046f6bf6-2c59-43f9-8964-5949209241b5-kube-api-access-hdqzn\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.704665 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.826437 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config\") pod \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.826516 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc\") pod \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.826632 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szmk5\" (UniqueName: \"kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5\") pod \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\" (UID: \"009682a7-982d-4cc3-9d7f-704c0c7c8d84\") " Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.831508 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5" (OuterVolumeSpecName: "kube-api-access-szmk5") pod "009682a7-982d-4cc3-9d7f-704c0c7c8d84" (UID: "009682a7-982d-4cc3-9d7f-704c0c7c8d84"). InnerVolumeSpecName "kube-api-access-szmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.857210 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "009682a7-982d-4cc3-9d7f-704c0c7c8d84" (UID: "009682a7-982d-4cc3-9d7f-704c0c7c8d84"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.860497 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config" (OuterVolumeSpecName: "config") pod "009682a7-982d-4cc3-9d7f-704c0c7c8d84" (UID: "009682a7-982d-4cc3-9d7f-704c0c7c8d84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.931536 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szmk5\" (UniqueName: \"kubernetes.io/projected/009682a7-982d-4cc3-9d7f-704c0c7c8d84-kube-api-access-szmk5\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.931562 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.931574 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/009682a7-982d-4cc3-9d7f-704c0c7c8d84-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.967322 4783 generic.go:334] "Generic (PLEG): container finished" podID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerID="b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377" exitCode=0 Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.967422 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" event={"ID":"009682a7-982d-4cc3-9d7f-704c0c7c8d84","Type":"ContainerDied","Data":"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377"} Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.967470 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" 
event={"ID":"009682a7-982d-4cc3-9d7f-704c0c7c8d84","Type":"ContainerDied","Data":"c80572e43e1bedfb4db431e6822af3e7b60be2bb7ab73dfd114ca55becb2240e"} Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.967469 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-h29qg" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.967493 4783 scope.go:117] "RemoveContainer" containerID="b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.969621 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ed3-account-create-update-xmcll" Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.969912 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ed3-account-create-update-xmcll" event={"ID":"046f6bf6-2c59-43f9-8964-5949209241b5","Type":"ContainerDied","Data":"6bbbfb189f199358bec1993c842211f71b5b1c842e8e36b1f97ed57e9740a6ac"} Jan 31 09:17:49 crc kubenswrapper[4783]: I0131 09:17:49.969965 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bbbfb189f199358bec1993c842211f71b5b1c842e8e36b1f97ed57e9740a6ac" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.015620 4783 scope.go:117] "RemoveContainer" containerID="1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.049322 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.058311 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-h29qg"] Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.076683 4783 scope.go:117] "RemoveContainer" containerID="b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377" Jan 31 09:17:50 crc kubenswrapper[4783]: E0131 
09:17:50.077359 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377\": container with ID starting with b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377 not found: ID does not exist" containerID="b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.077392 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377"} err="failed to get container status \"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377\": rpc error: code = NotFound desc = could not find container \"b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377\": container with ID starting with b162037c53c145a86dca751a4edc5a38838c8634caf635406ab3e95a804bc377 not found: ID does not exist" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.077415 4783 scope.go:117] "RemoveContainer" containerID="1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0" Jan 31 09:17:50 crc kubenswrapper[4783]: E0131 09:17:50.077774 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0\": container with ID starting with 1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0 not found: ID does not exist" containerID="1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.077795 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0"} err="failed to get container status \"1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0\": rpc 
error: code = NotFound desc = could not find container \"1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0\": container with ID starting with 1b2b225ab522194da1bccddc9113dd6c8a852ee113c3de778c208b32d435c8a0 not found: ID does not exist" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.319551 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9cg97" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.446315 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts\") pod \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.446828 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvkx4\" (UniqueName: \"kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4\") pod \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\" (UID: \"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.447175 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" (UID: "07cd4eec-ab95-4246-ab07-30bd4b8d6b9e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.448990 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.456337 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4" (OuterVolumeSpecName: "kube-api-access-dvkx4") pod "07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" (UID: "07cd4eec-ab95-4246-ab07-30bd4b8d6b9e"). InnerVolumeSpecName "kube-api-access-dvkx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.481019 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.485360 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.489620 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-czqdq" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.498528 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.551879 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvkx4\" (UniqueName: \"kubernetes.io/projected/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e-kube-api-access-dvkx4\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653011 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts\") pod \"3f714661-d55e-4c8f-b2c2-8420206b1a72\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653117 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts\") pod \"ec5fc027-5dcc-47d7-972a-ddf14c314725\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653188 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6th\" (UniqueName: \"kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th\") pod \"6bdad584-8c0f-433e-b36d-1a8584cecc18\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653320 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkpb\" (UniqueName: \"kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb\") pod \"3f714661-d55e-4c8f-b2c2-8420206b1a72\" (UID: \"3f714661-d55e-4c8f-b2c2-8420206b1a72\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653383 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m4kh\" (UniqueName: 
\"kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh\") pod \"fd3660ec-5394-45d9-ae35-5c23e4749178\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653472 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts\") pod \"6bdad584-8c0f-433e-b36d-1a8584cecc18\" (UID: \"6bdad584-8c0f-433e-b36d-1a8584cecc18\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653576 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f714661-d55e-4c8f-b2c2-8420206b1a72" (UID: "3f714661-d55e-4c8f-b2c2-8420206b1a72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653591 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts\") pod \"fd3660ec-5394-45d9-ae35-5c23e4749178\" (UID: \"fd3660ec-5394-45d9-ae35-5c23e4749178\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.653869 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g9gr\" (UniqueName: \"kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr\") pod \"ec5fc027-5dcc-47d7-972a-ddf14c314725\" (UID: \"ec5fc027-5dcc-47d7-972a-ddf14c314725\") " Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.654315 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"6bdad584-8c0f-433e-b36d-1a8584cecc18" (UID: "6bdad584-8c0f-433e-b36d-1a8584cecc18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.654494 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec5fc027-5dcc-47d7-972a-ddf14c314725" (UID: "ec5fc027-5dcc-47d7-972a-ddf14c314725"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.655277 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdad584-8c0f-433e-b36d-1a8584cecc18-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.655317 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f714661-d55e-4c8f-b2c2-8420206b1a72-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.655335 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5fc027-5dcc-47d7-972a-ddf14c314725-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.655131 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd3660ec-5394-45d9-ae35-5c23e4749178" (UID: "fd3660ec-5394-45d9-ae35-5c23e4749178"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.661841 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh" (OuterVolumeSpecName: "kube-api-access-7m4kh") pod "fd3660ec-5394-45d9-ae35-5c23e4749178" (UID: "fd3660ec-5394-45d9-ae35-5c23e4749178"). InnerVolumeSpecName "kube-api-access-7m4kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.661849 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb" (OuterVolumeSpecName: "kube-api-access-ptkpb") pod "3f714661-d55e-4c8f-b2c2-8420206b1a72" (UID: "3f714661-d55e-4c8f-b2c2-8420206b1a72"). InnerVolumeSpecName "kube-api-access-ptkpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.661925 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th" (OuterVolumeSpecName: "kube-api-access-ww6th") pod "6bdad584-8c0f-433e-b36d-1a8584cecc18" (UID: "6bdad584-8c0f-433e-b36d-1a8584cecc18"). InnerVolumeSpecName "kube-api-access-ww6th". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.661987 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr" (OuterVolumeSpecName: "kube-api-access-7g9gr") pod "ec5fc027-5dcc-47d7-972a-ddf14c314725" (UID: "ec5fc027-5dcc-47d7-972a-ddf14c314725"). InnerVolumeSpecName "kube-api-access-7g9gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.756975 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkpb\" (UniqueName: \"kubernetes.io/projected/3f714661-d55e-4c8f-b2c2-8420206b1a72-kube-api-access-ptkpb\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.757007 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m4kh\" (UniqueName: \"kubernetes.io/projected/fd3660ec-5394-45d9-ae35-5c23e4749178-kube-api-access-7m4kh\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.757017 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd3660ec-5394-45d9-ae35-5c23e4749178-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.757200 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g9gr\" (UniqueName: \"kubernetes.io/projected/ec5fc027-5dcc-47d7-972a-ddf14c314725-kube-api-access-7g9gr\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.757265 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6th\" (UniqueName: \"kubernetes.io/projected/6bdad584-8c0f-433e-b36d-1a8584cecc18-kube-api-access-ww6th\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.978804 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jf4sp" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.978793 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jf4sp" event={"ID":"ec5fc027-5dcc-47d7-972a-ddf14c314725","Type":"ContainerDied","Data":"1ad0b4131f61d557e8efe9888d253d366a3cea452d642eeb70009ca58cf1de0d"} Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.979199 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ad0b4131f61d557e8efe9888d253d366a3cea452d642eeb70009ca58cf1de0d" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.979806 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18c3-account-create-update-bb8tv" event={"ID":"3f714661-d55e-4c8f-b2c2-8420206b1a72","Type":"ContainerDied","Data":"730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b"} Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.979835 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730ba921b5b8bd608da2821825fe0ca19397c07610ed7bf14f3ea3dec71f248b" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.979977 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18c3-account-create-update-bb8tv" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.981546 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9cg97" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.981541 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9cg97" event={"ID":"07cd4eec-ab95-4246-ab07-30bd4b8d6b9e","Type":"ContainerDied","Data":"55f727a603483fa958c83d8909b4758b49f4add156590a20f43b77f17b050b65"} Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.981603 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f727a603483fa958c83d8909b4758b49f4add156590a20f43b77f17b050b65" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.983409 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-czqdq" Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.983436 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-czqdq" event={"ID":"fd3660ec-5394-45d9-ae35-5c23e4749178","Type":"ContainerDied","Data":"1d6aa9e993da41a85b870793a751b94812816ab2d015eddbf3307bd3b2c1d83b"} Jan 31 09:17:50 crc kubenswrapper[4783]: I0131 09:17:50.983470 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d6aa9e993da41a85b870793a751b94812816ab2d015eddbf3307bd3b2c1d83b" Jan 31 09:17:51 crc kubenswrapper[4783]: I0131 09:17:51.002690 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b696-account-create-update-4xlwd" event={"ID":"6bdad584-8c0f-433e-b36d-1a8584cecc18","Type":"ContainerDied","Data":"021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c"} Jan 31 09:17:51 crc kubenswrapper[4783]: I0131 09:17:51.002732 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021287c124a6fc5992488e6ded2851e4ebca76f4081c99f274d5621a5f6c8e4c" Jan 31 09:17:51 crc kubenswrapper[4783]: I0131 09:17:51.002798 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b696-account-create-update-4xlwd" Jan 31 09:17:51 crc kubenswrapper[4783]: I0131 09:17:51.658916 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" path="/var/lib/kubelet/pods/009682a7-982d-4cc3-9d7f-704c0c7c8d84/volumes" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.012067 4783 generic.go:334] "Generic (PLEG): container finished" podID="6827ccb1-8fcf-4451-a878-25d3d5765ae6" containerID="85538127a05d852d4d1cb118623cdc93eb3c6bfaf54e6ebd4130e0e33b36cb19" exitCode=0 Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.012156 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58vfp" event={"ID":"6827ccb1-8fcf-4451-a878-25d3d5765ae6","Type":"ContainerDied","Data":"85538127a05d852d4d1cb118623cdc93eb3c6bfaf54e6ebd4130e0e33b36cb19"} Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.596186 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-npz7m"] Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.596924 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f714661-d55e-4c8f-b2c2-8420206b1a72" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.596941 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f714661-d55e-4c8f-b2c2-8420206b1a72" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.596968 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd3660ec-5394-45d9-ae35-5c23e4749178" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.596976 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3660ec-5394-45d9-ae35-5c23e4749178" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.596990 4783 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="dnsmasq-dns" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.596997 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="dnsmasq-dns" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.597007 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffccfe82-01de-494d-ba4a-a59e4c242a7c" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597013 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffccfe82-01de-494d-ba4a-a59e4c242a7c" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.597040 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597047 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.597059 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdad584-8c0f-433e-b36d-1a8584cecc18" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597066 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdad584-8c0f-433e-b36d-1a8584cecc18" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.597078 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5fc027-5dcc-47d7-972a-ddf14c314725" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597084 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5fc027-5dcc-47d7-972a-ddf14c314725" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: 
E0131 09:17:52.597099 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="init" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597105 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="init" Jan 31 09:17:52 crc kubenswrapper[4783]: E0131 09:17:52.597116 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046f6bf6-2c59-43f9-8964-5949209241b5" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597122 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="046f6bf6-2c59-43f9-8964-5949209241b5" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597272 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5fc027-5dcc-47d7-972a-ddf14c314725" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597284 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdad584-8c0f-433e-b36d-1a8584cecc18" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597293 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffccfe82-01de-494d-ba4a-a59e4c242a7c" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597306 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f714661-d55e-4c8f-b2c2-8420206b1a72" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597314 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="046f6bf6-2c59-43f9-8964-5949209241b5" containerName="mariadb-account-create-update" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597324 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fd3660ec-5394-45d9-ae35-5c23e4749178" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597331 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" containerName="mariadb-database-create" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597340 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="009682a7-982d-4cc3-9d7f-704c0c7c8d84" containerName="dnsmasq-dns" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.597866 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.599605 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cbsh2" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.608249 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-npz7m"] Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.608537 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.693143 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnbl\" (UniqueName: \"kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.693403 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc 
kubenswrapper[4783]: I0131 09:17:52.693454 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.693537 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.795408 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.795468 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.795546 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.795607 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llnbl\" (UniqueName: \"kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.803842 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.803851 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.803860 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.809028 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnbl\" (UniqueName: \"kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl\") pod \"glance-db-sync-npz7m\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:52 crc kubenswrapper[4783]: I0131 09:17:52.913441 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-npz7m" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.302811 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.387121 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-npz7m"] Jan 31 09:17:53 crc kubenswrapper[4783]: W0131 09:17:53.388559 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64f2608f_7802_42b7_bc8a_2b1dbf829514.slice/crio-bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988 WatchSource:0}: Error finding container bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988: Status 404 returned error can't find the container with id bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988 Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.405987 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.406071 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.406192 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc 
kubenswrapper[4783]: I0131 09:17:53.406234 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfk6p\" (UniqueName: \"kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.406273 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.406310 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.406443 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift\") pod \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\" (UID: \"6827ccb1-8fcf-4451-a878-25d3d5765ae6\") " Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.407464 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.407691 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.411230 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p" (OuterVolumeSpecName: "kube-api-access-jfk6p") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "kube-api-access-jfk6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.413688 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.425646 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.426458 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.428763 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts" (OuterVolumeSpecName: "scripts") pod "6827ccb1-8fcf-4451-a878-25d3d5765ae6" (UID: "6827ccb1-8fcf-4451-a878-25d3d5765ae6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508602 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfk6p\" (UniqueName: \"kubernetes.io/projected/6827ccb1-8fcf-4451-a878-25d3d5765ae6-kube-api-access-jfk6p\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508635 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508647 4783 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508659 4783 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6827ccb1-8fcf-4451-a878-25d3d5765ae6-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 
crc kubenswrapper[4783]: I0131 09:17:53.508670 4783 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508679 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827ccb1-8fcf-4451-a878-25d3d5765ae6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.508689 4783 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6827ccb1-8fcf-4451-a878-25d3d5765ae6-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.946088 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bmgzl"] Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.950903 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bmgzl"] Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.963533 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x7h7f"] Jan 31 09:17:53 crc kubenswrapper[4783]: E0131 09:17:53.963883 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6827ccb1-8fcf-4451-a878-25d3d5765ae6" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.963895 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6827ccb1-8fcf-4451-a878-25d3d5765ae6" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.964029 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6827ccb1-8fcf-4451-a878-25d3d5765ae6" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.964568 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.966068 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 31 09:17:53 crc kubenswrapper[4783]: I0131 09:17:53.969250 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x7h7f"] Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.019751 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.019810 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bj8j\" (UniqueName: \"kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.042921 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-npz7m" event={"ID":"64f2608f-7802-42b7-bc8a-2b1dbf829514","Type":"ContainerStarted","Data":"bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988"} Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.044413 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-58vfp" event={"ID":"6827ccb1-8fcf-4451-a878-25d3d5765ae6","Type":"ContainerDied","Data":"d3d69d44673ede734038ea42db5c20bacf6e7348dc98495b73b754bfd648ddfc"} Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.044448 4783 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="d3d69d44673ede734038ea42db5c20bacf6e7348dc98495b73b754bfd648ddfc" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.044519 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-58vfp" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.123146 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.123283 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bj8j\" (UniqueName: \"kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.124576 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.147267 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bj8j\" (UniqueName: \"kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j\") pod \"root-account-create-update-x7h7f\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.280068 4783 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:54 crc kubenswrapper[4783]: I0131 09:17:54.796979 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x7h7f"] Jan 31 09:17:54 crc kubenswrapper[4783]: W0131 09:17:54.804098 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c6c47dd_3744_4c2e_84cd_8b6278d16ee3.slice/crio-ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34 WatchSource:0}: Error finding container ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34: Status 404 returned error can't find the container with id ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34 Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.054893 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x7h7f" event={"ID":"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3","Type":"ContainerStarted","Data":"8cddf7ac4a81c13114c84391a5017d5aca00eecc478930b10b608a8c2b1f6ddc"} Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.054978 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x7h7f" event={"ID":"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3","Type":"ContainerStarted","Data":"ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34"} Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.071398 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-x7h7f" podStartSLOduration=2.071383329 podStartE2EDuration="2.071383329s" podCreationTimestamp="2026-01-31 09:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:55.068933372 +0000 UTC m=+785.737616840" watchObservedRunningTime="2026-01-31 09:17:55.071383329 +0000 UTC 
m=+785.740066797" Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.654200 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffccfe82-01de-494d-ba4a-a59e4c242a7c" path="/var/lib/kubelet/pods/ffccfe82-01de-494d-ba4a-a59e4c242a7c/volumes" Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.962860 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.973391 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c78f0039-d432-4056-a572-d3049488bb75-etc-swift\") pod \"swift-storage-0\" (UID: \"c78f0039-d432-4056-a572-d3049488bb75\") " pod="openstack/swift-storage-0" Jan 31 09:17:55 crc kubenswrapper[4783]: I0131 09:17:55.989078 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 31 09:17:56 crc kubenswrapper[4783]: I0131 09:17:56.078018 4783 generic.go:334] "Generic (PLEG): container finished" podID="7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" containerID="8cddf7ac4a81c13114c84391a5017d5aca00eecc478930b10b608a8c2b1f6ddc" exitCode=0 Jan 31 09:17:56 crc kubenswrapper[4783]: I0131 09:17:56.078050 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x7h7f" event={"ID":"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3","Type":"ContainerDied","Data":"8cddf7ac4a81c13114c84391a5017d5aca00eecc478930b10b608a8c2b1f6ddc"} Jan 31 09:17:56 crc kubenswrapper[4783]: I0131 09:17:56.468709 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.092442 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"bf25ffddf8f39407eda34dfb6b5e7d6f68a7ed5fee2bf97335a04c2e3838f2f0"} Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.391337 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.486710 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts\") pod \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.486851 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bj8j\" (UniqueName: \"kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j\") pod \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\" (UID: \"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3\") " Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.487642 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" (UID: "7c6c47dd-3744-4c2e-84cd-8b6278d16ee3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.492044 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j" (OuterVolumeSpecName: "kube-api-access-9bj8j") pod "7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" (UID: "7c6c47dd-3744-4c2e-84cd-8b6278d16ee3"). InnerVolumeSpecName "kube-api-access-9bj8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.590626 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bj8j\" (UniqueName: \"kubernetes.io/projected/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-kube-api-access-9bj8j\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:57 crc kubenswrapper[4783]: I0131 09:17:57.590665 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 09:17:58.106654 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x7h7f" Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 09:17:58.106644 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x7h7f" event={"ID":"7c6c47dd-3744-4c2e-84cd-8b6278d16ee3","Type":"ContainerDied","Data":"ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34"} Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 09:17:58.107018 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba8f583a7162db2e90635a004417ee5914ca1557744dfb28d42bb9bb96e2bd34" Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 09:17:58.110435 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"ea632d3d8ec75b5c0a1eb23849ee1a04d0935dc3199e4a46345a7b2c7e5cb740"} Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 09:17:58.110488 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"aeb28578c5c177915d73a9b1bf438c96d0a5d9c2f8cfd3e8a516ef5da647cdb8"} Jan 31 09:17:58 crc kubenswrapper[4783]: I0131 
09:17:58.110500 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"723dfd3597cf02719519e970f6f41b6ff2dc85943cd5db3fe585a7304e24b62b"} Jan 31 09:17:59 crc kubenswrapper[4783]: I0131 09:17:59.124807 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"1a644a4b141422c862c3caab41371499e064612d5d095a34d7367db1e0094ff8"} Jan 31 09:18:02 crc kubenswrapper[4783]: I0131 09:18:02.126498 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 31 09:18:02 crc kubenswrapper[4783]: I0131 09:18:02.455548 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rll65" podUID="ea827790-18ef-4c55-8b5f-365ead9b9f6c" containerName="ovn-controller" probeResult="failure" output=< Jan 31 09:18:02 crc kubenswrapper[4783]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 31 09:18:02 crc kubenswrapper[4783]: > Jan 31 09:18:02 crc kubenswrapper[4783]: I0131 09:18:02.505600 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.173775 4783 generic.go:334] "Generic (PLEG): container finished" podID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerID="e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800" exitCode=0 Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.173866 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerDied","Data":"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800"} Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.176530 4783 generic.go:334] "Generic 
(PLEG): container finished" podID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerID="e7e18d5ab9b16321ee2a0c8b2935712bc5ad499d23d1ae5dc9633f021df16c76" exitCode=0 Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.176593 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerDied","Data":"e7e18d5ab9b16321ee2a0c8b2935712bc5ad499d23d1ae5dc9633f021df16c76"} Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.179246 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-npz7m" event={"ID":"64f2608f-7802-42b7-bc8a-2b1dbf829514","Type":"ContainerStarted","Data":"f51de41216bf7f21fcebd2b853cabe4c72d511b06aa125c331efed24a22160d9"} Jan 31 09:18:04 crc kubenswrapper[4783]: I0131 09:18:04.233594 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-npz7m" podStartSLOduration=1.91214034 podStartE2EDuration="12.233575909s" podCreationTimestamp="2026-01-31 09:17:52 +0000 UTC" firstStartedPulling="2026-01-31 09:17:53.390683267 +0000 UTC m=+784.059366726" lastFinishedPulling="2026-01-31 09:18:03.712118827 +0000 UTC m=+794.380802295" observedRunningTime="2026-01-31 09:18:04.224045219 +0000 UTC m=+794.892728686" watchObservedRunningTime="2026-01-31 09:18:04.233575909 +0000 UTC m=+794.902259377" Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.191820 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerStarted","Data":"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.192329 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.195774 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"934e1b6dbcfbabb74316715ae30260ed8a152b079820603ff7899e8b0990afce"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.195806 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"cebd585129bec0780dc19d6cb496a26e026aa17cf75b358f8830c3494e492389"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.195816 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"5b6f0a72bde07dfff632f0c752c2fb0247a960a5fcdf326ca1ed312051949032"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.195825 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"998a8c4c1de759b6740951da58754e987cd3ed35494a1192e4b6d3625ca7e348"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.197451 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerStarted","Data":"aed3b74b7093db8a32cf49978aa4e68d9e7e71074ce9ae62f92106c4466f7f8c"} Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.197764 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.239190 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.979718582 podStartE2EDuration="53.239175566s" podCreationTimestamp="2026-01-31 09:17:12 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.637906172 +0000 UTC m=+753.306589640" lastFinishedPulling="2026-01-31 09:17:30.897363156 +0000 UTC 
m=+761.566046624" observedRunningTime="2026-01-31 09:18:05.23484823 +0000 UTC m=+795.903531699" watchObservedRunningTime="2026-01-31 09:18:05.239175566 +0000 UTC m=+795.907859023" Jan 31 09:18:05 crc kubenswrapper[4783]: I0131 09:18:05.241254 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.343825001 podStartE2EDuration="53.24124782s" podCreationTimestamp="2026-01-31 09:17:12 +0000 UTC" firstStartedPulling="2026-01-31 09:17:22.003154593 +0000 UTC m=+752.671838062" lastFinishedPulling="2026-01-31 09:17:30.900577413 +0000 UTC m=+761.569260881" observedRunningTime="2026-01-31 09:18:05.212758892 +0000 UTC m=+795.881442360" watchObservedRunningTime="2026-01-31 09:18:05.24124782 +0000 UTC m=+795.909931288" Jan 31 09:18:06 crc kubenswrapper[4783]: I0131 09:18:06.211575 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"c40a3b0a38dc12c452287281ca8dd17e0968e6c6af64c3771b5d642f94c71d9d"} Jan 31 09:18:06 crc kubenswrapper[4783]: I0131 09:18:06.211835 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"04fa5a6206b8adecdab333826d109e60138ecbe0614893dca86df4a20f82caf7"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.229001 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"29b05704188b39cea7e150ebed1134d2b14bf3e2bfebb520467be201434b1fb4"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.229370 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"d6638cc3cd50b1b35a4bca843b8efee406b9bcd569e18dbc61508bd0f54b0dde"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.229384 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"0bf36f01d6b34c0972ad6de6167ac3e02614d18295de36a3daa6bcee5ac4e09b"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.229394 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"bbf1ba92513b44211a294b821b623ce2faec7684d6a6d17af112ba14a462e88f"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.229401 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c78f0039-d432-4056-a572-d3049488bb75","Type":"ContainerStarted","Data":"35ae3532fe9f79befe7ce07b4818e05a52385c6c57a046ade9b8a078fc407d36"} Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.282563 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=18.853147511 podStartE2EDuration="28.282545683s" podCreationTimestamp="2026-01-31 09:17:39 +0000 UTC" firstStartedPulling="2026-01-31 09:17:56.486609481 +0000 UTC m=+787.155292949" lastFinishedPulling="2026-01-31 09:18:05.916007653 +0000 UTC m=+796.584691121" observedRunningTime="2026-01-31 09:18:07.272376119 +0000 UTC m=+797.941059586" watchObservedRunningTime="2026-01-31 09:18:07.282545683 +0000 UTC m=+797.951229150" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.464987 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rll65" podUID="ea827790-18ef-4c55-8b5f-365ead9b9f6c" containerName="ovn-controller" probeResult="failure" output=< Jan 31 09:18:07 crc kubenswrapper[4783]: ERROR - ovn-controller 
connection status is 'not connected', expecting 'connected' status Jan 31 09:18:07 crc kubenswrapper[4783]: > Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.511315 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7st6" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.633269 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:07 crc kubenswrapper[4783]: E0131 09:18:07.633824 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" containerName="mariadb-account-create-update" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.633842 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" containerName="mariadb-account-create-update" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.634051 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" containerName="mariadb-account-create-update" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.635252 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.638611 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.676147 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.713923 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rll65-config-6rwfs"] Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.715220 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.716976 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.720386 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rll65-config-6rwfs"] Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.753877 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.753947 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.753981 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.753999 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " 
pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.754028 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.754241 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxsnz\" (UniqueName: \"kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856419 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856467 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856491 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run\") pod 
\"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856536 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856559 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856590 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856611 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fdq7\" (UniqueName: \"kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856648 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config\") pod 
\"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856665 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856687 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.856936 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857068 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxsnz\" (UniqueName: \"kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857469 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config\") pod 
\"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857627 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857623 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857820 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.857983 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.874921 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxsnz\" (UniqueName: \"kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz\") pod \"dnsmasq-dns-8467b54bcc-gzcfn\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " 
pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.954005 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958532 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958571 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fdq7\" (UniqueName: \"kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958602 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958676 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958696 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958714 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.958960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.959005 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.959867 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.959925 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.961744 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:07 crc kubenswrapper[4783]: I0131 09:18:07.974915 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fdq7\" (UniqueName: \"kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7\") pod \"ovn-controller-rll65-config-6rwfs\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:08 crc kubenswrapper[4783]: I0131 09:18:08.039827 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:08 crc kubenswrapper[4783]: I0131 09:18:08.411816 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:08 crc kubenswrapper[4783]: W0131 09:18:08.418105 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aff2883_d2b4_4c7a_82f4_0b55bb9bc971.slice/crio-7d0e3bc87a00fc0fd0cda768c7e233d6b1297c23fc41aee1ace730cd8e99591b WatchSource:0}: Error finding container 7d0e3bc87a00fc0fd0cda768c7e233d6b1297c23fc41aee1ace730cd8e99591b: Status 404 returned error can't find the container with id 7d0e3bc87a00fc0fd0cda768c7e233d6b1297c23fc41aee1ace730cd8e99591b Jan 31 09:18:08 crc kubenswrapper[4783]: W0131 09:18:08.479619 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d84473d_17b7_4531_9360_0daf2cf6e4ae.slice/crio-143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816 WatchSource:0}: Error finding container 143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816: Status 404 returned error can't find the container with id 143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816 Jan 31 09:18:08 crc kubenswrapper[4783]: I0131 09:18:08.480426 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rll65-config-6rwfs"] Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.245723 4783 generic.go:334] "Generic (PLEG): container finished" podID="64f2608f-7802-42b7-bc8a-2b1dbf829514" containerID="f51de41216bf7f21fcebd2b853cabe4c72d511b06aa125c331efed24a22160d9" exitCode=0 Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.245832 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-npz7m" 
event={"ID":"64f2608f-7802-42b7-bc8a-2b1dbf829514","Type":"ContainerDied","Data":"f51de41216bf7f21fcebd2b853cabe4c72d511b06aa125c331efed24a22160d9"} Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.248137 4783 generic.go:334] "Generic (PLEG): container finished" podID="9d84473d-17b7-4531-9360-0daf2cf6e4ae" containerID="d9a7dd089c49c65762431a2e230d81aefcdb23477c5c4143dc88da366e13df57" exitCode=0 Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.248205 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rll65-config-6rwfs" event={"ID":"9d84473d-17b7-4531-9360-0daf2cf6e4ae","Type":"ContainerDied","Data":"d9a7dd089c49c65762431a2e230d81aefcdb23477c5c4143dc88da366e13df57"} Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.248264 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rll65-config-6rwfs" event={"ID":"9d84473d-17b7-4531-9360-0daf2cf6e4ae","Type":"ContainerStarted","Data":"143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816"} Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.250120 4783 generic.go:334] "Generic (PLEG): container finished" podID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerID="735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1" exitCode=0 Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.250214 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" event={"ID":"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971","Type":"ContainerDied","Data":"735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1"} Jan 31 09:18:09 crc kubenswrapper[4783]: I0131 09:18:09.250278 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" event={"ID":"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971","Type":"ContainerStarted","Data":"7d0e3bc87a00fc0fd0cda768c7e233d6b1297c23fc41aee1ace730cd8e99591b"} Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.260942 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" event={"ID":"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971","Type":"ContainerStarted","Data":"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9"} Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.261407 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.545966 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.564156 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" podStartSLOduration=3.564134258 podStartE2EDuration="3.564134258s" podCreationTimestamp="2026-01-31 09:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:10.28549555 +0000 UTC m=+800.954179019" watchObservedRunningTime="2026-01-31 09:18:10.564134258 +0000 UTC m=+801.232817727" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.608733 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.608870 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.608864 4783 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.608953 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fdq7\" (UniqueName: \"kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609066 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609117 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609186 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run\") pod \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\" (UID: \"9d84473d-17b7-4531-9360-0daf2cf6e4ae\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609198 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod 
"9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609302 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run" (OuterVolumeSpecName: "var-run") pod "9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.609834 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.610300 4783 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.610327 4783 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.610345 4783 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.610356 4783 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9d84473d-17b7-4531-9360-0daf2cf6e4ae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.611035 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts" (OuterVolumeSpecName: "scripts") pod "9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.614859 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7" (OuterVolumeSpecName: "kube-api-access-2fdq7") pod "9d84473d-17b7-4531-9360-0daf2cf6e4ae" (UID: "9d84473d-17b7-4531-9360-0daf2cf6e4ae"). InnerVolumeSpecName "kube-api-access-2fdq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.659384 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-npz7m" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.711538 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data\") pod \"64f2608f-7802-42b7-bc8a-2b1dbf829514\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.711607 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data\") pod \"64f2608f-7802-42b7-bc8a-2b1dbf829514\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.711714 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle\") pod \"64f2608f-7802-42b7-bc8a-2b1dbf829514\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.711748 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnbl\" (UniqueName: \"kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl\") pod \"64f2608f-7802-42b7-bc8a-2b1dbf829514\" (UID: \"64f2608f-7802-42b7-bc8a-2b1dbf829514\") " Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.712748 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d84473d-17b7-4531-9360-0daf2cf6e4ae-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.712770 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fdq7\" (UniqueName: \"kubernetes.io/projected/9d84473d-17b7-4531-9360-0daf2cf6e4ae-kube-api-access-2fdq7\") 
on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.715102 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "64f2608f-7802-42b7-bc8a-2b1dbf829514" (UID: "64f2608f-7802-42b7-bc8a-2b1dbf829514"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.715795 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl" (OuterVolumeSpecName: "kube-api-access-llnbl") pod "64f2608f-7802-42b7-bc8a-2b1dbf829514" (UID: "64f2608f-7802-42b7-bc8a-2b1dbf829514"). InnerVolumeSpecName "kube-api-access-llnbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.729844 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64f2608f-7802-42b7-bc8a-2b1dbf829514" (UID: "64f2608f-7802-42b7-bc8a-2b1dbf829514"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.744530 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data" (OuterVolumeSpecName: "config-data") pod "64f2608f-7802-42b7-bc8a-2b1dbf829514" (UID: "64f2608f-7802-42b7-bc8a-2b1dbf829514"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.814551 4783 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.814587 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.814596 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f2608f-7802-42b7-bc8a-2b1dbf829514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:10 crc kubenswrapper[4783]: I0131 09:18:10.814610 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llnbl\" (UniqueName: \"kubernetes.io/projected/64f2608f-7802-42b7-bc8a-2b1dbf829514-kube-api-access-llnbl\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.268561 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rll65-config-6rwfs" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.268563 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rll65-config-6rwfs" event={"ID":"9d84473d-17b7-4531-9360-0daf2cf6e4ae","Type":"ContainerDied","Data":"143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816"} Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.269352 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="143d3571f9b14c599f449286a175358154df084218857bf657cba3f820870816" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.271644 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-npz7m" event={"ID":"64f2608f-7802-42b7-bc8a-2b1dbf829514","Type":"ContainerDied","Data":"bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988"} Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.271695 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce7e05628b7c85f32f28307c695254461518d2365a7ffa23381ce7a14971988" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.271660 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-npz7m" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.535442 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.590618 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:11 crc kubenswrapper[4783]: E0131 09:18:11.590954 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d84473d-17b7-4531-9360-0daf2cf6e4ae" containerName="ovn-config" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.590973 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d84473d-17b7-4531-9360-0daf2cf6e4ae" containerName="ovn-config" Jan 31 09:18:11 crc kubenswrapper[4783]: E0131 09:18:11.590989 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f2608f-7802-42b7-bc8a-2b1dbf829514" containerName="glance-db-sync" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.590998 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f2608f-7802-42b7-bc8a-2b1dbf829514" containerName="glance-db-sync" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.591147 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f2608f-7802-42b7-bc8a-2b1dbf829514" containerName="glance-db-sync" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.591220 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d84473d-17b7-4531-9360-0daf2cf6e4ae" containerName="ovn-config" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.591910 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.603431 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.656821 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rll65-config-6rwfs"] Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.664264 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rll65-config-6rwfs"] Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.729514 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.729578 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.729610 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.729851 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gszsx\" (UniqueName: 
\"kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.729980 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.730248 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832639 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832727 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832753 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832803 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gszsx\" (UniqueName: \"kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832834 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.832869 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.833828 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.834072 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.834182 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.834294 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.834305 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.849494 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gszsx\" (UniqueName: \"kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx\") pod \"dnsmasq-dns-56c9bc6f5c-tqvtm\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:11 crc kubenswrapper[4783]: I0131 09:18:11.911666 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:12 crc kubenswrapper[4783]: I0131 09:18:12.323683 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:12 crc kubenswrapper[4783]: I0131 09:18:12.466112 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rll65" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.293875 4783 generic.go:334] "Generic (PLEG): container finished" podID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerID="fe918a56b93fdc1f4a8d52322f395b62f7e5df453de0c2de89257cfe97572e60" exitCode=0 Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.294008 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" event={"ID":"8be19e32-ed6c-42b2-9bf7-15bec0bc9696","Type":"ContainerDied","Data":"fe918a56b93fdc1f4a8d52322f395b62f7e5df453de0c2de89257cfe97572e60"} Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.294121 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" event={"ID":"8be19e32-ed6c-42b2-9bf7-15bec0bc9696","Type":"ContainerStarted","Data":"8e5248546510818a27ad43090af2118e9fe46b745ac3e5ab08f497e2f97a43a4"} Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.294318 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="dnsmasq-dns" containerID="cri-o://ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9" gracePeriod=10 Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.655491 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d84473d-17b7-4531-9360-0daf2cf6e4ae" path="/var/lib/kubelet/pods/9d84473d-17b7-4531-9360-0daf2cf6e4ae/volumes" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.700617 4783 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775360 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxsnz\" (UniqueName: \"kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775651 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775681 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775770 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775893 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.775943 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb\") pod \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\" (UID: \"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971\") " Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.781390 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz" (OuterVolumeSpecName: "kube-api-access-nxsnz") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "kube-api-access-nxsnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.809486 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config" (OuterVolumeSpecName: "config") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.811798 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.813518 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.814681 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.819348 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" (UID: "4aff2883-d2b4-4c7a-82f4-0b55bb9bc971"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.877976 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.878002 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxsnz\" (UniqueName: \"kubernetes.io/projected/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-kube-api-access-nxsnz\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.878016 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.878026 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 
09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.878036 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:13 crc kubenswrapper[4783]: I0131 09:18:13.878056 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.053439 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.307478 4783 generic.go:334] "Generic (PLEG): container finished" podID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerID="ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9" exitCode=0 Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.307566 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" event={"ID":"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971","Type":"ContainerDied","Data":"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9"} Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.307619 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" event={"ID":"4aff2883-d2b4-4c7a-82f4-0b55bb9bc971","Type":"ContainerDied","Data":"7d0e3bc87a00fc0fd0cda768c7e233d6b1297c23fc41aee1ace730cd8e99591b"} Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.307619 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-gzcfn" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.307642 4783 scope.go:117] "RemoveContainer" containerID="ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.309682 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" event={"ID":"8be19e32-ed6c-42b2-9bf7-15bec0bc9696","Type":"ContainerStarted","Data":"0125f737588e8a5cc8c31b4b7fe7d54d51b5b515ea722d0c944867e5027fefc3"} Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.309871 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.330019 4783 scope.go:117] "RemoveContainer" containerID="735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.342112 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" podStartSLOduration=3.34208747 podStartE2EDuration="3.34208747s" podCreationTimestamp="2026-01-31 09:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:14.338034443 +0000 UTC m=+805.006717910" watchObservedRunningTime="2026-01-31 09:18:14.34208747 +0000 UTC m=+805.010770938" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.358052 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.360178 4783 scope.go:117] "RemoveContainer" containerID="ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9" Jan 31 09:18:14 crc kubenswrapper[4783]: E0131 09:18:14.360736 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9\": container with ID starting with ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9 not found: ID does not exist" containerID="ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.360810 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9"} err="failed to get container status \"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9\": rpc error: code = NotFound desc = could not find container \"ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9\": container with ID starting with ec23eda118786dc174ba04b8fd50a497ab3370a3a7c5ae160ac71d969edd0ba9 not found: ID does not exist" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.360860 4783 scope.go:117] "RemoveContainer" containerID="735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1" Jan 31 09:18:14 crc kubenswrapper[4783]: E0131 09:18:14.361410 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1\": container with ID starting with 735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1 not found: ID does not exist" containerID="735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.361480 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1"} err="failed to get container status \"735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1\": rpc error: code = NotFound desc = could not find container \"735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1\": 
container with ID starting with 735323f1d842dafcabfe0be5e2eb36bb1091e9f3e56c533d6e720e7ffb39b0a1 not found: ID does not exist" Jan 31 09:18:14 crc kubenswrapper[4783]: I0131 09:18:14.362506 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-gzcfn"] Jan 31 09:18:15 crc kubenswrapper[4783]: I0131 09:18:15.654939 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" path="/var/lib/kubelet/pods/4aff2883-d2b4-4c7a-82f4-0b55bb9bc971/volumes" Jan 31 09:18:21 crc kubenswrapper[4783]: I0131 09:18:21.913324 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:21 crc kubenswrapper[4783]: I0131 09:18:21.955977 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:18:21 crc kubenswrapper[4783]: I0131 09:18:21.956200 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="dnsmasq-dns" containerID="cri-o://b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede" gracePeriod=10 Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.335663 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.375188 4783 generic.go:334] "Generic (PLEG): container finished" podID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerID="b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede" exitCode=0 Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.375258 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.375285 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" event={"ID":"3de91956-c9f6-4dda-ab39-1bb28e7b16de","Type":"ContainerDied","Data":"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede"} Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.375346 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-99hbj" event={"ID":"3de91956-c9f6-4dda-ab39-1bb28e7b16de","Type":"ContainerDied","Data":"67e59e16b242b96668036e80e8f6aa10d6665c1c6773e628e7037777b006ffae"} Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.375379 4783 scope.go:117] "RemoveContainer" containerID="b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.392778 4783 scope.go:117] "RemoveContainer" containerID="d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.414258 4783 scope.go:117] "RemoveContainer" containerID="b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede" Jan 31 09:18:22 crc kubenswrapper[4783]: E0131 09:18:22.414625 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede\": container with ID starting with b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede not found: ID does not exist" containerID="b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.414729 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede"} err="failed to get container status 
\"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede\": rpc error: code = NotFound desc = could not find container \"b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede\": container with ID starting with b2794d18433a398a32d428a7c8f47dac82b451a793c3b64ff22c60f17cd93ede not found: ID does not exist" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.414818 4783 scope.go:117] "RemoveContainer" containerID="d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2" Jan 31 09:18:22 crc kubenswrapper[4783]: E0131 09:18:22.415096 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2\": container with ID starting with d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2 not found: ID does not exist" containerID="d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.415196 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2"} err="failed to get container status \"d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2\": rpc error: code = NotFound desc = could not find container \"d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2\": container with ID starting with d54082396eb8aa054fd23b440ea190b69885497d0d2a891415cce534c8ff62d2 not found: ID does not exist" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.422293 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb\") pod \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.422380 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c682h\" (UniqueName: \"kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h\") pod \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.422595 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb\") pod \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.422812 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config\") pod \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.422935 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc\") pod \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\" (UID: \"3de91956-c9f6-4dda-ab39-1bb28e7b16de\") " Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.435465 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h" (OuterVolumeSpecName: "kube-api-access-c682h") pod "3de91956-c9f6-4dda-ab39-1bb28e7b16de" (UID: "3de91956-c9f6-4dda-ab39-1bb28e7b16de"). InnerVolumeSpecName "kube-api-access-c682h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.456134 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3de91956-c9f6-4dda-ab39-1bb28e7b16de" (UID: "3de91956-c9f6-4dda-ab39-1bb28e7b16de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.456191 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3de91956-c9f6-4dda-ab39-1bb28e7b16de" (UID: "3de91956-c9f6-4dda-ab39-1bb28e7b16de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.459007 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config" (OuterVolumeSpecName: "config") pod "3de91956-c9f6-4dda-ab39-1bb28e7b16de" (UID: "3de91956-c9f6-4dda-ab39-1bb28e7b16de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.466992 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3de91956-c9f6-4dda-ab39-1bb28e7b16de" (UID: "3de91956-c9f6-4dda-ab39-1bb28e7b16de"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.525058 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.525084 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.525095 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.525106 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3de91956-c9f6-4dda-ab39-1bb28e7b16de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.525120 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c682h\" (UniqueName: \"kubernetes.io/projected/3de91956-c9f6-4dda-ab39-1bb28e7b16de-kube-api-access-c682h\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.700012 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:18:22 crc kubenswrapper[4783]: I0131 09:18:22.704048 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-99hbj"] Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.655043 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" path="/var/lib/kubelet/pods/3de91956-c9f6-4dda-ab39-1bb28e7b16de/volumes" Jan 31 09:18:23 crc kubenswrapper[4783]: 
I0131 09:18:23.779541 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.981639 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-22mw9"]
Jan 31 09:18:23 crc kubenswrapper[4783]: E0131 09:18:23.982244 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="init"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982311 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="init"
Jan 31 09:18:23 crc kubenswrapper[4783]: E0131 09:18:23.982395 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982453 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: E0131 09:18:23.982517 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982570 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: E0131 09:18:23.982621 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="init"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982669 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="init"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982830 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de91956-c9f6-4dda-ab39-1bb28e7b16de" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.982890 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aff2883-d2b4-4c7a-82f4-0b55bb9bc971" containerName="dnsmasq-dns"
Jan 31 09:18:23 crc kubenswrapper[4783]: I0131 09:18:23.983406 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.003584 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-22mw9"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.044826 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjnwr\" (UniqueName: \"kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.044926 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.082552 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h5m5k"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.084371 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.091081 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7909-account-create-update-5jfvb"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.093095 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.094584 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.098637 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5m5k"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.103437 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7909-account-create-update-5jfvb"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146598 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmv6\" (UniqueName: \"kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146669 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjnwr\" (UniqueName: \"kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146690 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146719 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146743 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.146807 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smzds\" (UniqueName: \"kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.147619 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.166562 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjnwr\" (UniqueName: \"kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr\") pod \"barbican-db-create-22mw9\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") " pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.187907 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0fd3-account-create-update-9x4ll"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.190154 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.196364 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.217052 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0fd3-account-create-update-9x4ll"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.248891 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6tzx\" (UniqueName: \"kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.248984 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smzds\" (UniqueName: \"kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.249029 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmv6\" (UniqueName: \"kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.249068 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.249087 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.249117 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.249751 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.250048 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.262901 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmv6\" (UniqueName: \"kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6\") pod \"cinder-7909-account-create-update-5jfvb\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") " pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.264661 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smzds\" (UniqueName: \"kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds\") pod \"cinder-db-create-h5m5k\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.295842 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.327945 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p9k2r"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.329420 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.337410 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.337623 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.337920 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.338085 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzwfs"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.338142 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p9k2r"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.352787 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.352877 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6tzx\" (UniqueName: \"kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.353748 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.383988 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6tzx\" (UniqueName: \"kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx\") pod \"barbican-0fd3-account-create-update-9x4ll\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.397389 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5m5k"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.406389 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.459103 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.459240 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5x7\" (UniqueName: \"kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.459342 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.496477 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xkw48"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.499510 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.504829 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-496c-account-create-update-6jkmm"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.505635 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.506729 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.508735 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0fd3-account-create-update-9x4ll"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.514005 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-496c-account-create-update-6jkmm"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.531051 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xkw48"]
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561678 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561721 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m7f\" (UniqueName: \"kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561751 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5x7\" (UniqueName: \"kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561771 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561823 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561855 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.561884 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdfc\" (UniqueName: \"kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.575338 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.578598 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.582708 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5x7\" (UniqueName: \"kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7\") pod \"keystone-db-sync-p9k2r\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.664230 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4m7f\" (UniqueName: \"kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.664513 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.664674 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.664701 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdfc\" (UniqueName: \"kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.665701 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.665736 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.672969 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p9k2r"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.679244 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4m7f\" (UniqueName: \"kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f\") pod \"neutron-db-create-xkw48\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.679554 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdfc\" (UniqueName: \"kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc\") pod \"neutron-496c-account-create-update-6jkmm\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:24 crc kubenswrapper[4783]: I0131 09:18:24.760434 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-22mw9"]
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:24.813331 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xkw48"
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:24.824113 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-496c-account-create-update-6jkmm"
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:24.904782 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5m5k"]
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:24.967020 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7909-account-create-update-5jfvb"]
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.022006 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0fd3-account-create-update-9x4ll"]
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.400008 4783 generic.go:334] "Generic (PLEG): container finished" podID="37863ae7-16e3-4030-9d28-9f9e312e941a" containerID="3e392250f5a57eeb4ae09672f984aacd6138ff0541a50f2720f57ebdf40b49d2" exitCode=0
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.400095 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-22mw9" event={"ID":"37863ae7-16e3-4030-9d28-9f9e312e941a","Type":"ContainerDied","Data":"3e392250f5a57eeb4ae09672f984aacd6138ff0541a50f2720f57ebdf40b49d2"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.400281 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-22mw9" event={"ID":"37863ae7-16e3-4030-9d28-9f9e312e941a","Type":"ContainerStarted","Data":"ceff2cdd5c0f364452f469adc2b3d7e5a125c153a40e420664446c4a3e14f10c"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.404005 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0fd3-account-create-update-9x4ll" event={"ID":"181145e5-44a7-448b-b5a3-f6f03f325c01","Type":"ContainerStarted","Data":"58612a40d7d85fca3d09f85cde23b1b012a2b2dce1840cc535b9f3510b7ebad5"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.404036 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0fd3-account-create-update-9x4ll" event={"ID":"181145e5-44a7-448b-b5a3-f6f03f325c01","Type":"ContainerStarted","Data":"847ba83f5e915200b52dcae111ad50bf4bd6df02f8aee1518c4aa97870c3c120"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.405782 4783 generic.go:334] "Generic (PLEG): container finished" podID="0ac471b1-537d-4498-90ea-0c8ccb699ae8" containerID="4c3ad5d4efa5f8761849e7c13dd01b28de263966ea54c345a85746b51cfe6323" exitCode=0
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.405867 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7909-account-create-update-5jfvb" event={"ID":"0ac471b1-537d-4498-90ea-0c8ccb699ae8","Type":"ContainerDied","Data":"4c3ad5d4efa5f8761849e7c13dd01b28de263966ea54c345a85746b51cfe6323"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.405899 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7909-account-create-update-5jfvb" event={"ID":"0ac471b1-537d-4498-90ea-0c8ccb699ae8","Type":"ContainerStarted","Data":"ba75e4326d33478fcbc10b16dbded38fd7eb061ab2ec30bebef5b75b4ff62dfd"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.407020 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5m5k" event={"ID":"2de0e2db-9b04-476b-a09f-7cf7415ba0e7","Type":"ContainerStarted","Data":"34d32922b8a575ca56d1ec1e4a5c7b47e34f1c8ae94ab4f026da18bb69ef8b30"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.407046 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5m5k" event={"ID":"2de0e2db-9b04-476b-a09f-7cf7415ba0e7","Type":"ContainerStarted","Data":"7f73012b6f8e6fb1f767d4ac4896863eaa2aa2fb75a0725e21fb9f7e5835bf25"}
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.442230 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-h5m5k" podStartSLOduration=1.442213821 podStartE2EDuration="1.442213821s" podCreationTimestamp="2026-01-31 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:25.428611748 +0000 UTC m=+816.097295217" watchObservedRunningTime="2026-01-31 09:18:25.442213821 +0000 UTC m=+816.110897289"
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.443282 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0fd3-account-create-update-9x4ll" podStartSLOduration=1.4432766240000001 podStartE2EDuration="1.443276624s" podCreationTimestamp="2026-01-31 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:25.439823988 +0000 UTC m=+816.108507456" watchObservedRunningTime="2026-01-31 09:18:25.443276624 +0000 UTC m=+816.111960093"
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.670247 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p9k2r"]
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.737976 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-496c-account-create-update-6jkmm"]
Jan 31 09:18:25 crc kubenswrapper[4783]: W0131 09:18:25.739723 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12869632_705f_4514_b2b1_d2eb29bc986b.slice/crio-4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469 WatchSource:0}: Error finding container 4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469: Status 404 returned error can't find the container with id 4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469
Jan 31 09:18:25 crc kubenswrapper[4783]: W0131 09:18:25.742067 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6502457_d446_4d87_ab1b_aae6fd53f95a.slice/crio-c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb WatchSource:0}: Error finding container c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb: Status 404 returned error can't find the container with id c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb
Jan 31 09:18:25 crc kubenswrapper[4783]: I0131 09:18:25.748613 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xkw48"]
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.418387 4783 generic.go:334] "Generic (PLEG): container finished" podID="a6502457-d446-4d87-ab1b-aae6fd53f95a" containerID="76dd931b2b6a05e8c9b2f971eb4750f62ab7a839ea67268ca8fbf2909701b98c" exitCode=0
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.418523 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xkw48" event={"ID":"a6502457-d446-4d87-ab1b-aae6fd53f95a","Type":"ContainerDied","Data":"76dd931b2b6a05e8c9b2f971eb4750f62ab7a839ea67268ca8fbf2909701b98c"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.418729 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xkw48" event={"ID":"a6502457-d446-4d87-ab1b-aae6fd53f95a","Type":"ContainerStarted","Data":"c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.421230 4783 generic.go:334] "Generic (PLEG): container finished" podID="181145e5-44a7-448b-b5a3-f6f03f325c01" containerID="58612a40d7d85fca3d09f85cde23b1b012a2b2dce1840cc535b9f3510b7ebad5" exitCode=0
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.421483 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0fd3-account-create-update-9x4ll" event={"ID":"181145e5-44a7-448b-b5a3-f6f03f325c01","Type":"ContainerDied","Data":"58612a40d7d85fca3d09f85cde23b1b012a2b2dce1840cc535b9f3510b7ebad5"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.423264 4783 generic.go:334] "Generic (PLEG): container finished" podID="12869632-705f-4514-b2b1-d2eb29bc986b" containerID="beb762ada300dca21fab34d7f101d847437b4613ce8f4d1435b4bb65234feb2c" exitCode=0
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.423319 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-496c-account-create-update-6jkmm" event={"ID":"12869632-705f-4514-b2b1-d2eb29bc986b","Type":"ContainerDied","Data":"beb762ada300dca21fab34d7f101d847437b4613ce8f4d1435b4bb65234feb2c"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.423342 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-496c-account-create-update-6jkmm" event={"ID":"12869632-705f-4514-b2b1-d2eb29bc986b","Type":"ContainerStarted","Data":"4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.424838 4783 generic.go:334] "Generic (PLEG): container finished" podID="2de0e2db-9b04-476b-a09f-7cf7415ba0e7" containerID="34d32922b8a575ca56d1ec1e4a5c7b47e34f1c8ae94ab4f026da18bb69ef8b30" exitCode=0
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.424889 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5m5k" event={"ID":"2de0e2db-9b04-476b-a09f-7cf7415ba0e7","Type":"ContainerDied","Data":"34d32922b8a575ca56d1ec1e4a5c7b47e34f1c8ae94ab4f026da18bb69ef8b30"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.426221 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9k2r" event={"ID":"4d851f24-0013-417f-9734-d28d22b27057","Type":"ContainerStarted","Data":"67ea5d053ef03c37b9a55d7b25c0bdcc1f62183453101cf6c1054ce4b9876046"}
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.784720 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7909-account-create-update-5jfvb"
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.788719 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-22mw9"
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.919152 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjnwr\" (UniqueName: \"kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr\") pod \"37863ae7-16e3-4030-9d28-9f9e312e941a\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") "
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.919434 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts\") pod \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") "
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.919555 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts\") pod \"37863ae7-16e3-4030-9d28-9f9e312e941a\" (UID: \"37863ae7-16e3-4030-9d28-9f9e312e941a\") "
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.919627 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmv6\" (UniqueName: \"kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6\") pod \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\" (UID: \"0ac471b1-537d-4498-90ea-0c8ccb699ae8\") "
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.919902 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ac471b1-537d-4498-90ea-0c8ccb699ae8" (UID: "0ac471b1-537d-4498-90ea-0c8ccb699ae8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.920028 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37863ae7-16e3-4030-9d28-9f9e312e941a" (UID: "37863ae7-16e3-4030-9d28-9f9e312e941a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.920240 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37863ae7-16e3-4030-9d28-9f9e312e941a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.920256 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac471b1-537d-4498-90ea-0c8ccb699ae8-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.926744 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr" (OuterVolumeSpecName: "kube-api-access-tjnwr") pod "37863ae7-16e3-4030-9d28-9f9e312e941a" (UID: "37863ae7-16e3-4030-9d28-9f9e312e941a"). InnerVolumeSpecName "kube-api-access-tjnwr".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:26 crc kubenswrapper[4783]: I0131 09:18:26.927393 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6" (OuterVolumeSpecName: "kube-api-access-swmv6") pod "0ac471b1-537d-4498-90ea-0c8ccb699ae8" (UID: "0ac471b1-537d-4498-90ea-0c8ccb699ae8"). InnerVolumeSpecName "kube-api-access-swmv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.024925 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjnwr\" (UniqueName: \"kubernetes.io/projected/37863ae7-16e3-4030-9d28-9f9e312e941a-kube-api-access-tjnwr\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.025014 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmv6\" (UniqueName: \"kubernetes.io/projected/0ac471b1-537d-4498-90ea-0c8ccb699ae8-kube-api-access-swmv6\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.443083 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7909-account-create-update-5jfvb" event={"ID":"0ac471b1-537d-4498-90ea-0c8ccb699ae8","Type":"ContainerDied","Data":"ba75e4326d33478fcbc10b16dbded38fd7eb061ab2ec30bebef5b75b4ff62dfd"} Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.443183 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba75e4326d33478fcbc10b16dbded38fd7eb061ab2ec30bebef5b75b4ff62dfd" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.443213 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7909-account-create-update-5jfvb" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.448645 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-22mw9" event={"ID":"37863ae7-16e3-4030-9d28-9f9e312e941a","Type":"ContainerDied","Data":"ceff2cdd5c0f364452f469adc2b3d7e5a125c153a40e420664446c4a3e14f10c"} Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.448707 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceff2cdd5c0f364452f469adc2b3d7e5a125c153a40e420664446c4a3e14f10c" Jan 31 09:18:27 crc kubenswrapper[4783]: I0131 09:18:27.448780 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-22mw9" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.604582 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xkw48" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.607571 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0fd3-account-create-update-9x4ll" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.628342 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5m5k" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.633555 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-496c-account-create-update-6jkmm" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.671948 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6tzx\" (UniqueName: \"kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx\") pod \"181145e5-44a7-448b-b5a3-f6f03f325c01\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.672397 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts\") pod \"181145e5-44a7-448b-b5a3-f6f03f325c01\" (UID: \"181145e5-44a7-448b-b5a3-f6f03f325c01\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.672429 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m7f\" (UniqueName: \"kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f\") pod \"a6502457-d446-4d87-ab1b-aae6fd53f95a\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.672573 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts\") pod \"a6502457-d446-4d87-ab1b-aae6fd53f95a\" (UID: \"a6502457-d446-4d87-ab1b-aae6fd53f95a\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.673685 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "181145e5-44a7-448b-b5a3-f6f03f325c01" (UID: "181145e5-44a7-448b-b5a3-f6f03f325c01"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.673828 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6502457-d446-4d87-ab1b-aae6fd53f95a" (UID: "a6502457-d446-4d87-ab1b-aae6fd53f95a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.678100 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f" (OuterVolumeSpecName: "kube-api-access-s4m7f") pod "a6502457-d446-4d87-ab1b-aae6fd53f95a" (UID: "a6502457-d446-4d87-ab1b-aae6fd53f95a"). InnerVolumeSpecName "kube-api-access-s4m7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.679554 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx" (OuterVolumeSpecName: "kube-api-access-n6tzx") pod "181145e5-44a7-448b-b5a3-f6f03f325c01" (UID: "181145e5-44a7-448b-b5a3-f6f03f325c01"). InnerVolumeSpecName "kube-api-access-n6tzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.774515 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts\") pod \"12869632-705f-4514-b2b1-d2eb29bc986b\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.774618 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smzds\" (UniqueName: \"kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds\") pod \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.774641 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdfc\" (UniqueName: \"kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc\") pod \"12869632-705f-4514-b2b1-d2eb29bc986b\" (UID: \"12869632-705f-4514-b2b1-d2eb29bc986b\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.775197 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12869632-705f-4514-b2b1-d2eb29bc986b" (UID: "12869632-705f-4514-b2b1-d2eb29bc986b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.775696 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts\") pod \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\" (UID: \"2de0e2db-9b04-476b-a09f-7cf7415ba0e7\") " Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.776189 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de0e2db-9b04-476b-a09f-7cf7415ba0e7" (UID: "2de0e2db-9b04-476b-a09f-7cf7415ba0e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777229 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777569 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6tzx\" (UniqueName: \"kubernetes.io/projected/181145e5-44a7-448b-b5a3-f6f03f325c01-kube-api-access-n6tzx\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777700 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181145e5-44a7-448b-b5a3-f6f03f325c01-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777755 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4m7f\" (UniqueName: \"kubernetes.io/projected/a6502457-d446-4d87-ab1b-aae6fd53f95a-kube-api-access-s4m7f\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc 
kubenswrapper[4783]: I0131 09:18:29.777817 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6502457-d446-4d87-ab1b-aae6fd53f95a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777658 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc" (OuterVolumeSpecName: "kube-api-access-6kdfc") pod "12869632-705f-4514-b2b1-d2eb29bc986b" (UID: "12869632-705f-4514-b2b1-d2eb29bc986b"). InnerVolumeSpecName "kube-api-access-6kdfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.777880 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12869632-705f-4514-b2b1-d2eb29bc986b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.778526 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds" (OuterVolumeSpecName: "kube-api-access-smzds") pod "2de0e2db-9b04-476b-a09f-7cf7415ba0e7" (UID: "2de0e2db-9b04-476b-a09f-7cf7415ba0e7"). InnerVolumeSpecName "kube-api-access-smzds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.880071 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdfc\" (UniqueName: \"kubernetes.io/projected/12869632-705f-4514-b2b1-d2eb29bc986b-kube-api-access-6kdfc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4783]: I0131 09:18:29.880114 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smzds\" (UniqueName: \"kubernetes.io/projected/2de0e2db-9b04-476b-a09f-7cf7415ba0e7-kube-api-access-smzds\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.479195 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5m5k" event={"ID":"2de0e2db-9b04-476b-a09f-7cf7415ba0e7","Type":"ContainerDied","Data":"7f73012b6f8e6fb1f767d4ac4896863eaa2aa2fb75a0725e21fb9f7e5835bf25"} Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.479341 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f73012b6f8e6fb1f767d4ac4896863eaa2aa2fb75a0725e21fb9f7e5835bf25" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.479454 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5m5k" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.485416 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9k2r" event={"ID":"4d851f24-0013-417f-9734-d28d22b27057","Type":"ContainerStarted","Data":"a17c1c50244939cae88c79b5c8a17429354156e7d7fa02fd3c273caaa04394ca"} Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.489696 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xkw48" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.489857 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xkw48" event={"ID":"a6502457-d446-4d87-ab1b-aae6fd53f95a","Type":"ContainerDied","Data":"c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb"} Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.489883 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3fc6d30da16c44e570253e7b37740152a44663508151f10e7beba06a81682fb" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.491294 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0fd3-account-create-update-9x4ll" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.491304 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0fd3-account-create-update-9x4ll" event={"ID":"181145e5-44a7-448b-b5a3-f6f03f325c01","Type":"ContainerDied","Data":"847ba83f5e915200b52dcae111ad50bf4bd6df02f8aee1518c4aa97870c3c120"} Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.491346 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="847ba83f5e915200b52dcae111ad50bf4bd6df02f8aee1518c4aa97870c3c120" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.492818 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-496c-account-create-update-6jkmm" event={"ID":"12869632-705f-4514-b2b1-d2eb29bc986b","Type":"ContainerDied","Data":"4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469"} Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.492840 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec080f0525b1605fbd6f107bf1e2887e92a7fc80c74d7375cf0f9515d66b469" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.492891 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-496c-account-create-update-6jkmm" Jan 31 09:18:30 crc kubenswrapper[4783]: I0131 09:18:30.648905 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p9k2r" podStartSLOduration=2.837683791 podStartE2EDuration="6.648882215s" podCreationTimestamp="2026-01-31 09:18:24 +0000 UTC" firstStartedPulling="2026-01-31 09:18:25.682450741 +0000 UTC m=+816.351134208" lastFinishedPulling="2026-01-31 09:18:29.493649164 +0000 UTC m=+820.162332632" observedRunningTime="2026-01-31 09:18:30.498205794 +0000 UTC m=+821.166889262" watchObservedRunningTime="2026-01-31 09:18:30.648882215 +0000 UTC m=+821.317565684" Jan 31 09:18:31 crc kubenswrapper[4783]: I0131 09:18:31.501604 4783 generic.go:334] "Generic (PLEG): container finished" podID="4d851f24-0013-417f-9734-d28d22b27057" containerID="a17c1c50244939cae88c79b5c8a17429354156e7d7fa02fd3c273caaa04394ca" exitCode=0 Jan 31 09:18:31 crc kubenswrapper[4783]: I0131 09:18:31.501669 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9k2r" event={"ID":"4d851f24-0013-417f-9734-d28d22b27057","Type":"ContainerDied","Data":"a17c1c50244939cae88c79b5c8a17429354156e7d7fa02fd3c273caaa04394ca"} Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.750566 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p9k2r" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.822844 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data\") pod \"4d851f24-0013-417f-9734-d28d22b27057\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.822997 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks5x7\" (UniqueName: \"kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7\") pod \"4d851f24-0013-417f-9734-d28d22b27057\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.823208 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle\") pod \"4d851f24-0013-417f-9734-d28d22b27057\" (UID: \"4d851f24-0013-417f-9734-d28d22b27057\") " Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.828787 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7" (OuterVolumeSpecName: "kube-api-access-ks5x7") pod "4d851f24-0013-417f-9734-d28d22b27057" (UID: "4d851f24-0013-417f-9734-d28d22b27057"). InnerVolumeSpecName "kube-api-access-ks5x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.845663 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d851f24-0013-417f-9734-d28d22b27057" (UID: "4d851f24-0013-417f-9734-d28d22b27057"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.859939 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data" (OuterVolumeSpecName: "config-data") pod "4d851f24-0013-417f-9734-d28d22b27057" (UID: "4d851f24-0013-417f-9734-d28d22b27057"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.925451 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.925477 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d851f24-0013-417f-9734-d28d22b27057-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:32 crc kubenswrapper[4783]: I0131 09:18:32.925488 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks5x7\" (UniqueName: \"kubernetes.io/projected/4d851f24-0013-417f-9734-d28d22b27057-kube-api-access-ks5x7\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.519084 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p9k2r" event={"ID":"4d851f24-0013-417f-9734-d28d22b27057","Type":"ContainerDied","Data":"67ea5d053ef03c37b9a55d7b25c0bdcc1f62183453101cf6c1054ce4b9876046"} Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.519129 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p9k2r" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.519136 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ea5d053ef03c37b9a55d7b25c0bdcc1f62183453101cf6c1054ce4b9876046" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.958707 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-spzmd"] Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959281 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181145e5-44a7-448b-b5a3-f6f03f325c01" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959295 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="181145e5-44a7-448b-b5a3-f6f03f325c01" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959308 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de0e2db-9b04-476b-a09f-7cf7415ba0e7" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959314 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de0e2db-9b04-476b-a09f-7cf7415ba0e7" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959322 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac471b1-537d-4498-90ea-0c8ccb699ae8" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959328 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac471b1-537d-4498-90ea-0c8ccb699ae8" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959339 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37863ae7-16e3-4030-9d28-9f9e312e941a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959345 4783 
state_mem.go:107] "Deleted CPUSet assignment" podUID="37863ae7-16e3-4030-9d28-9f9e312e941a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959365 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d851f24-0013-417f-9734-d28d22b27057" containerName="keystone-db-sync" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959370 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d851f24-0013-417f-9734-d28d22b27057" containerName="keystone-db-sync" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959375 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12869632-705f-4514-b2b1-d2eb29bc986b" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959387 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="12869632-705f-4514-b2b1-d2eb29bc986b" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: E0131 09:18:33.959397 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6502457-d446-4d87-ab1b-aae6fd53f95a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959403 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6502457-d446-4d87-ab1b-aae6fd53f95a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959526 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="37863ae7-16e3-4030-9d28-9f9e312e941a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959539 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac471b1-537d-4498-90ea-0c8ccb699ae8" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959550 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d851f24-0013-417f-9734-d28d22b27057" 
containerName="keystone-db-sync" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959558 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="181145e5-44a7-448b-b5a3-f6f03f325c01" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959570 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="12869632-705f-4514-b2b1-d2eb29bc986b" containerName="mariadb-account-create-update" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959583 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de0e2db-9b04-476b-a09f-7cf7415ba0e7" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.959590 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6502457-d446-4d87-ab1b-aae6fd53f95a" containerName="mariadb-database-create" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.960061 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.962025 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.962132 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzwfs" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.962118 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.962128 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.965469 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.994897 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zq2kc"] Jan 31 09:18:33 crc kubenswrapper[4783]: I0131 09:18:33.996109 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.014568 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zq2kc"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040239 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qlhp\" (UniqueName: \"kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040283 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040425 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040510 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040657 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.040692 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.048326 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-spzmd"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141684 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141778 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141812 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qlhp\" (UniqueName: \"kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc 
kubenswrapper[4783]: I0131 09:18:34.141831 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141853 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141906 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141931 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141958 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141975 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcf2z\" (UniqueName: \"kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.141997 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.142023 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.142058 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.147271 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.147408 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.150758 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.158426 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.162226 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.163529 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.171557 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.171778 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-75z5l" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.171785 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qlhp\" (UniqueName: \"kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.171925 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.171935 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.173105 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data\") pod \"keystone-bootstrap-spzmd\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.192815 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.235871 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zq2kc"] Jan 31 09:18:34 crc kubenswrapper[4783]: E0131 09:18:34.236603 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-xcf2z 
ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" podUID="68667699-5af2-4b17-b31c-b33dfb76798e" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245151 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxpfb\" (UniqueName: \"kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245210 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245280 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245305 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245326 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245359 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245396 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcf2z\" (UniqueName: \"kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245422 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245439 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245465 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.245485 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.246370 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.246399 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.246449 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-c5vpg"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.246631 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.247335 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.247410 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.247443 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.252485 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lj2nd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.252692 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.252873 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.282574 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c5vpg"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.286455 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.287079 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcf2z\" (UniqueName: \"kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z\") pod \"dnsmasq-dns-54b4bb76d5-zq2kc\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.300398 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.355302 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.365278 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366371 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366435 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366457 
4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366512 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366536 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxpfb\" (UniqueName: \"kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366552 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366579 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366626 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366642 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwz5\" (UniqueName: \"kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.366666 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.372794 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.377370 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.382628 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.384816 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.407431 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.411280 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.412304 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxpfb\" (UniqueName: \"kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb\") pod \"horizon-5989bc564f-6q4l6\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.422140 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6c24v"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.423352 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.434436 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.434853 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gb2rj" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.434970 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.445104 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.457651 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.464404 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6c24v"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.464541 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.467631 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.467740 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.468732 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.467861 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.470489 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cbsh2" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477212 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477312 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwz5\" (UniqueName: \"kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477465 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477563 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477638 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.470930 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477708 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477866 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxznq\" (UniqueName: \"kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: 
\"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477658 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.477981 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.478120 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.478248 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.489995 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc 
kubenswrapper[4783]: I0131 09:18:34.490057 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.492478 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.495497 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.496716 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.499638 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.499862 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.500787 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.508109 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwz5\" (UniqueName: \"kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5\") pod \"placement-db-sync-c5vpg\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.519058 4783 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.532968 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.533859 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.537986 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.537993 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.539303 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.566541 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580521 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580568 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580597 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580614 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580634 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580662 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9nd6\" (UniqueName: \"kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580682 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2t96\" (UniqueName: \"kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580705 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580731 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580757 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580774 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580793 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580811 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580830 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hss89\" (UniqueName: \"kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.580898 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581016 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581724 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581750 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581776 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxznq\" (UniqueName: \"kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581806 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581824 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581855 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581876 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581901 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.581995 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.583280 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-c5vpg" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.584118 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.584145 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.584208 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.584323 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.584567 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 
09:18:34.591314 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.604202 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxznq\" (UniqueName: \"kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq\") pod \"dnsmasq-dns-5dc4fcdbc-tlndp\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.665652 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k2pw7"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.667529 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.671452 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqzmk" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.672662 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.676057 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k2pw7"] Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.689888 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb\") pod \"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690265 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config\") pod 
\"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690361 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb\") pod \"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690409 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0\") pod \"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690458 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcf2z\" (UniqueName: \"kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z\") pod \"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690507 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc\") pod \"68667699-5af2-4b17-b31c-b33dfb76798e\" (UID: \"68667699-5af2-4b17-b31c-b33dfb76798e\") " Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690789 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config" (OuterVolumeSpecName: "config") pod "68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690801 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690836 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690862 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690891 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690908 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 
crc kubenswrapper[4783]: I0131 09:18:34.690946 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.690984 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691012 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxzp\" (UniqueName: \"kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691045 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691073 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 
09:18:34.691099 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691113 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691128 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691154 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9nd6\" (UniqueName: \"kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691192 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2t96\" (UniqueName: \"kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691222 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f8h26\" (UniqueName: \"kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691241 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691260 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691290 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691312 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691328 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691349 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691370 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691396 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691414 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691435 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hss89\" (UniqueName: \"kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89\") pod \"cinder-db-sync-6c24v\" (UID: 
\"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691452 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691480 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691501 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691531 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691548 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 
crc kubenswrapper[4783]: I0131 09:18:34.691562 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691587 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691610 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.691655 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.692024 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.693178 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.693662 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.695559 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.698129 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.699402 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.702332 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.702800 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.703606 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.716074 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.717488 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.718394 4783 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z" (OuterVolumeSpecName: "kube-api-access-xcf2z") pod "68667699-5af2-4b17-b31c-b33dfb76798e" (UID: "68667699-5af2-4b17-b31c-b33dfb76798e"). InnerVolumeSpecName "kube-api-access-xcf2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.718608 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.720497 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2t96\" (UniqueName: \"kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96\") pod \"horizon-cc66f58c7-79z2q\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.721425 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hss89\" (UniqueName: \"kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.723235 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.724462 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.725934 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.726149 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.726232 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.728458 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.729810 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9nd6\" (UniqueName: 
\"kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.741175 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.748315 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.752444 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts\") pod \"cinder-db-sync-6c24v\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.757497 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.759096 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.783367 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6c24v" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.794517 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795576 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795608 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795638 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795662 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795689 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795712 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.795733 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.798912 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800054 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800364 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800449 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800628 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800984 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801074 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801135 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801255 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqcn\" (UniqueName: \"kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn\") pod \"barbican-db-sync-k2pw7\" (UID: 
\"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801317 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxzp\" (UniqueName: \"kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801430 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801505 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8h26\" (UniqueName: \"kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801568 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801658 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcf2z\" (UniqueName: \"kubernetes.io/projected/68667699-5af2-4b17-b31c-b33dfb76798e-kube-api-access-xcf2z\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801710 4783 reconciler_common.go:293] "Volume detached 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801766 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801815 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801860 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68667699-5af2-4b17-b31c-b33dfb76798e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800222 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.801657 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800309 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.800947 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.802469 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.807913 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.810202 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.814719 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.817035 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.817844 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxzp\" (UniqueName: \"kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp\") pod \"glance-default-internal-api-0\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.817939 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.820069 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.825838 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8h26\" (UniqueName: \"kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26\") pod \"ceilometer-0\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " pod="openstack/ceilometer-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.848944 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"81c94de6-2854-4f8c-9d50-2467632fa290\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.909375 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.909433 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.909459 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqcn\" (UniqueName: \"kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.914112 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.920766 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" 
Jan 31 09:18:34 crc kubenswrapper[4783]: I0131 09:18:34.944091 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqcn\" (UniqueName: \"kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn\") pod \"barbican-db-sync-k2pw7\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.004404 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7mds6"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.006278 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.027895 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.039156 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.039381 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.044909 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d6l9f" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.050225 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-spzmd"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.122480 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.122818 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qqd\" (UniqueName: \"kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.122885 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.123193 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.125406 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mds6"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.153341 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.171200 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.224816 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6qqd\" (UniqueName: \"kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.225116 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.225271 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.231520 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.231629 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config\") pod \"neutron-db-sync-7mds6\" (UID: 
\"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.255475 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qqd\" (UniqueName: \"kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd\") pod \"neutron-db-sync-7mds6\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.340505 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.371679 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c5vpg"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.511323 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:18:35 crc kubenswrapper[4783]: W0131 09:18:35.513857 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode28f3ed6_b41b_46f2_822a_0a7a1c79e512.slice/crio-15e54d73de0e6da76d7852766a4374ead716adb651411288c63c4d30fb7ef116 WatchSource:0}: Error finding container 15e54d73de0e6da76d7852766a4374ead716adb651411288c63c4d30fb7ef116: Status 404 returned error can't find the container with id 15e54d73de0e6da76d7852766a4374ead716adb651411288c63c4d30fb7ef116 Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.554682 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c5vpg" event={"ID":"be3abc10-848a-448b-a2a2-df825e99a23f","Type":"ContainerStarted","Data":"492f2e8a4078eb9777859fe393372cb93070e5f17568cf3d7d3918b8ae88c4a9"} Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.556130 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" 
event={"ID":"e28f3ed6-b41b-46f2-822a-0a7a1c79e512","Type":"ContainerStarted","Data":"15e54d73de0e6da76d7852766a4374ead716adb651411288c63c4d30fb7ef116"} Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.557633 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerStarted","Data":"7244e6cc3caf7fc0265490fefbf19c0b20e802ba037473ce8073f3ee29435ad3"} Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.559079 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-zq2kc" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.559395 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spzmd" event={"ID":"beacf9d0-1587-4cb5-b9e9-d284ff8b8288","Type":"ContainerStarted","Data":"b29a5cc127c8a868aa3a95c1ff709c7d0c5b580c5d71197fdd3e747de1a870c8"} Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.559425 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spzmd" event={"ID":"beacf9d0-1587-4cb5-b9e9-d284ff8b8288","Type":"ContainerStarted","Data":"1a311e43b33858bffb71304e30483b281b8af7a5a4005afc37adea7267f09ba7"} Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.587946 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.595639 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-spzmd" podStartSLOduration=2.595627189 podStartE2EDuration="2.595627189s" podCreationTimestamp="2026-01-31 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:35.582744111 +0000 UTC m=+826.251427579" watchObservedRunningTime="2026-01-31 09:18:35.595627189 +0000 UTC m=+826.264310657" Jan 31 
09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.606838 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6c24v"] Jan 31 09:18:35 crc kubenswrapper[4783]: W0131 09:18:35.612518 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod349997ee_6053_4d85_8eae_1d4adf3b347e.slice/crio-bc14e919f8b3449414fb735c730884d8728eab6a3837b28384ad0a32a307afa1 WatchSource:0}: Error finding container bc14e919f8b3449414fb735c730884d8728eab6a3837b28384ad0a32a307afa1: Status 404 returned error can't find the container with id bc14e919f8b3449414fb735c730884d8728eab6a3837b28384ad0a32a307afa1 Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.642412 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zq2kc"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.659770 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-zq2kc"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.764680 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.789232 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k2pw7"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.798857 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.817541 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.872379 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.873539 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.897454 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.926760 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.942420 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.942506 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.942562 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.942604 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: 
I0131 09:18:35.942667 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzhz\" (UniqueName: \"kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.953284 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.963152 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mds6"] Jan 31 09:18:35 crc kubenswrapper[4783]: W0131 09:18:35.965758 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6 WatchSource:0}: Error finding container 84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6: Status 404 returned error can't find the container with id 84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6 Jan 31 09:18:35 crc kubenswrapper[4783]: I0131 09:18:35.987983 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.054563 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.055855 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.055904 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.055961 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.056004 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.056026 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzhz\" (UniqueName: \"kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.057579 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.057838 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.058495 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.061607 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.086634 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzhz\" (UniqueName: \"kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz\") pod \"horizon-5cd96d8cbc-jvn7k\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.194048 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.638803 4783 generic.go:334] "Generic (PLEG): container finished" podID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerID="6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5" exitCode=0 Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.638905 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" event={"ID":"e28f3ed6-b41b-46f2-822a-0a7a1c79e512","Type":"ContainerDied","Data":"6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.642693 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mds6" event={"ID":"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9","Type":"ContainerStarted","Data":"150e5e0164b552e8488ac8a43500e605706978ef8990e67ecf3ee6a5e5f27f0b"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.642739 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mds6" event={"ID":"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9","Type":"ContainerStarted","Data":"ae97364a096f47933f1e641942ae177aef284abbe7e807834b23bdfd4e98be92"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.654340 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerStarted","Data":"84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.672528 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerStarted","Data":"6fa45e3260a1e34f0d7d7a0c846f185e609c64ea50cecf4f1deeee00343c3bfe"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.686799 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerStarted","Data":"bc14e919f8b3449414fb735c730884d8728eab6a3837b28384ad0a32a307afa1"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.688835 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6c24v" event={"ID":"6275a243-2cfc-4f77-a4b5-40a697e309d9","Type":"ContainerStarted","Data":"b33d7436ddd08c683623148f9b35ba95a8238cd60e440a039df8f85844891792"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.691514 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2pw7" event={"ID":"fd0c0937-6461-4221-bbd8-3c7e37bbff9d","Type":"ContainerStarted","Data":"b1bea4ce8f9c74d70b70c2cf7edb881af63d164602649430e59eb6637119de81"} Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.695581 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.698295 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7mds6" podStartSLOduration=2.6982829280000002 podStartE2EDuration="2.698282928s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:36.683094135 +0000 UTC m=+827.351777603" watchObservedRunningTime="2026-01-31 09:18:36.698282928 +0000 UTC m=+827.366966396" Jan 31 09:18:36 crc kubenswrapper[4783]: I0131 09:18:36.707426 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerStarted","Data":"afb5f3344ad57551329d1cccec8e3d382bd8d10a9cbc5cf621df6b0751f0874e"} Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.656565 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="68667699-5af2-4b17-b31c-b33dfb76798e" path="/var/lib/kubelet/pods/68667699-5af2-4b17-b31c-b33dfb76798e/volumes" Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.740255 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerStarted","Data":"033f16f99c5d072377b88f0989d9f1f4d8ab65824d3ca5ea841d80f922630ef0"} Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.744324 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" event={"ID":"e28f3ed6-b41b-46f2-822a-0a7a1c79e512","Type":"ContainerStarted","Data":"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae"} Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.744492 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.751005 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerStarted","Data":"17312f09b941fce56994b0cd5dc31766ee5dbacb8bc604072156d90f5067b760"} Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.756901 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerStarted","Data":"97b21fd07638dace24707d4370ee45b51df0bec9b662cbd7dcb21f33a68cbe55"} Jan 31 09:18:37 crc kubenswrapper[4783]: I0131 09:18:37.765651 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" podStartSLOduration=3.765640027 podStartE2EDuration="3.765640027s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 
09:18:37.759434212 +0000 UTC m=+828.428117679" watchObservedRunningTime="2026-01-31 09:18:37.765640027 +0000 UTC m=+828.434323494" Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.778572 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerStarted","Data":"429bd9032be043a8d2b56acc8f77c1d8d04ae21916462438903d7e0947a0c027"} Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.779328 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-log" containerID="cri-o://97b21fd07638dace24707d4370ee45b51df0bec9b662cbd7dcb21f33a68cbe55" gracePeriod=30 Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.779608 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-httpd" containerID="cri-o://429bd9032be043a8d2b56acc8f77c1d8d04ae21916462438903d7e0947a0c027" gracePeriod=30 Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.782192 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerStarted","Data":"330fd53d7adbaaf321a25ebf04431aa0b7a95ce7d496ed22d14f64a5a449bd74"} Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.782289 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-log" containerID="cri-o://033f16f99c5d072377b88f0989d9f1f4d8ab65824d3ca5ea841d80f922630ef0" gracePeriod=30 Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.782364 4783 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-httpd" containerID="cri-o://330fd53d7adbaaf321a25ebf04431aa0b7a95ce7d496ed22d14f64a5a449bd74" gracePeriod=30 Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.822133 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.822111524 podStartE2EDuration="5.822111524s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:39.820830399 +0000 UTC m=+830.489513867" watchObservedRunningTime="2026-01-31 09:18:39.822111524 +0000 UTC m=+830.490794992" Jan 31 09:18:39 crc kubenswrapper[4783]: I0131 09:18:39.844924 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.844904016 podStartE2EDuration="5.844904016s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:39.838879593 +0000 UTC m=+830.507563061" watchObservedRunningTime="2026-01-31 09:18:39.844904016 +0000 UTC m=+830.513587484" Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.800484 4783 generic.go:334] "Generic (PLEG): container finished" podID="e52e03f3-fa41-487d-affe-89222406f4bb" containerID="429bd9032be043a8d2b56acc8f77c1d8d04ae21916462438903d7e0947a0c027" exitCode=0 Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.800782 4783 generic.go:334] "Generic (PLEG): container finished" podID="e52e03f3-fa41-487d-affe-89222406f4bb" containerID="97b21fd07638dace24707d4370ee45b51df0bec9b662cbd7dcb21f33a68cbe55" exitCode=143 Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.800565 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerDied","Data":"429bd9032be043a8d2b56acc8f77c1d8d04ae21916462438903d7e0947a0c027"} Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.800860 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerDied","Data":"97b21fd07638dace24707d4370ee45b51df0bec9b662cbd7dcb21f33a68cbe55"} Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.802887 4783 generic.go:334] "Generic (PLEG): container finished" podID="beacf9d0-1587-4cb5-b9e9-d284ff8b8288" containerID="b29a5cc127c8a868aa3a95c1ff709c7d0c5b580c5d71197fdd3e747de1a870c8" exitCode=0 Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.802979 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spzmd" event={"ID":"beacf9d0-1587-4cb5-b9e9-d284ff8b8288","Type":"ContainerDied","Data":"b29a5cc127c8a868aa3a95c1ff709c7d0c5b580c5d71197fdd3e747de1a870c8"} Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.808847 4783 generic.go:334] "Generic (PLEG): container finished" podID="81c94de6-2854-4f8c-9d50-2467632fa290" containerID="330fd53d7adbaaf321a25ebf04431aa0b7a95ce7d496ed22d14f64a5a449bd74" exitCode=0 Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.808877 4783 generic.go:334] "Generic (PLEG): container finished" podID="81c94de6-2854-4f8c-9d50-2467632fa290" containerID="033f16f99c5d072377b88f0989d9f1f4d8ab65824d3ca5ea841d80f922630ef0" exitCode=143 Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.808902 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerDied","Data":"330fd53d7adbaaf321a25ebf04431aa0b7a95ce7d496ed22d14f64a5a449bd74"} Jan 31 09:18:40 crc kubenswrapper[4783]: I0131 09:18:40.808928 4783 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerDied","Data":"033f16f99c5d072377b88f0989d9f1f4d8ab65824d3ca5ea841d80f922630ef0"} Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.425353 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.452798 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.453938 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.457814 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.469807 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.526117 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.537566 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6644bf8978-q24zg"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.538743 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.555419 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6644bf8978-q24zg"] Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.601953 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-logs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602010 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602030 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602049 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-combined-ca-bundle\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602071 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mpg\" 
(UniqueName: \"kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602098 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-tls-certs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602285 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602357 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-secret-key\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602471 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-config-data\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602516 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-scripts\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602561 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602610 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5s9f\" (UniqueName: \"kubernetes.io/projected/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-kube-api-access-w5s9f\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602648 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.602715 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705067 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705182 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-logs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705250 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705310 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-combined-ca-bundle\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705375 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mpg\" (UniqueName: \"kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg\") pod 
\"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.705616 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-logs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706196 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-tls-certs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706334 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706417 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-secret-key\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706546 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-config-data\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 
crc kubenswrapper[4783]: I0131 09:18:42.706598 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-scripts\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706643 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706677 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5s9f\" (UniqueName: \"kubernetes.io/projected/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-kube-api-access-w5s9f\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.706726 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.707347 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.707411 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.709517 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-scripts\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.709807 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-config-data\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.710155 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.712966 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-tls-certs\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.716751 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle\") pod \"horizon-f5ff596f4-ffmss\" (UID: 
\"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.717258 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-combined-ca-bundle\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.717483 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.722013 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mpg\" (UniqueName: \"kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.722070 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs\") pod \"horizon-f5ff596f4-ffmss\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.725068 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-horizon-secret-key\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc 
kubenswrapper[4783]: I0131 09:18:42.733309 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5s9f\" (UniqueName: \"kubernetes.io/projected/940e2e96-d6a1-4576-b83a-e30ff1f6ab85-kube-api-access-w5s9f\") pod \"horizon-6644bf8978-q24zg\" (UID: \"940e2e96-d6a1-4576-b83a-e30ff1f6ab85\") " pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.775142 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:18:42 crc kubenswrapper[4783]: I0131 09:18:42.868049 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:18:43 crc kubenswrapper[4783]: I0131 09:18:43.837149 4783 generic.go:334] "Generic (PLEG): container finished" podID="ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" containerID="150e5e0164b552e8488ac8a43500e605706978ef8990e67ecf3ee6a5e5f27f0b" exitCode=0 Jan 31 09:18:43 crc kubenswrapper[4783]: I0131 09:18:43.837414 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mds6" event={"ID":"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9","Type":"ContainerDied","Data":"150e5e0164b552e8488ac8a43500e605706978ef8990e67ecf3ee6a5e5f27f0b"} Jan 31 09:18:44 crc kubenswrapper[4783]: I0131 09:18:44.749378 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:18:44 crc kubenswrapper[4783]: I0131 09:18:44.806124 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:44 crc kubenswrapper[4783]: I0131 09:18:44.806422 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" containerID="cri-o://0125f737588e8a5cc8c31b4b7fe7d54d51b5b515ea722d0c944867e5027fefc3" gracePeriod=10 Jan 31 
09:18:45 crc kubenswrapper[4783]: I0131 09:18:45.860969 4783 generic.go:334] "Generic (PLEG): container finished" podID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerID="0125f737588e8a5cc8c31b4b7fe7d54d51b5b515ea722d0c944867e5027fefc3" exitCode=0 Jan 31 09:18:45 crc kubenswrapper[4783]: I0131 09:18:45.861075 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" event={"ID":"8be19e32-ed6c-42b2-9bf7-15bec0bc9696","Type":"ContainerDied","Data":"0125f737588e8a5cc8c31b4b7fe7d54d51b5b515ea722d0c944867e5027fefc3"} Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.171615 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280110 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qlhp\" (UniqueName: \"kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280382 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280520 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280599 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280702 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.280874 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys\") pod \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\" (UID: \"beacf9d0-1587-4cb5-b9e9-d284ff8b8288\") " Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.285639 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts" (OuterVolumeSpecName: "scripts") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.289602 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.289711 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.291589 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp" (OuterVolumeSpecName: "kube-api-access-7qlhp") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "kube-api-access-7qlhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.306519 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data" (OuterVolumeSpecName: "config-data") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.307150 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "beacf9d0-1587-4cb5-b9e9-d284ff8b8288" (UID: "beacf9d0-1587-4cb5-b9e9-d284ff8b8288"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384636 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384690 4783 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384705 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qlhp\" (UniqueName: \"kubernetes.io/projected/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-kube-api-access-7qlhp\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384722 4783 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384734 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.384744 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beacf9d0-1587-4cb5-b9e9-d284ff8b8288-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.872488 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spzmd" event={"ID":"beacf9d0-1587-4cb5-b9e9-d284ff8b8288","Type":"ContainerDied","Data":"1a311e43b33858bffb71304e30483b281b8af7a5a4005afc37adea7267f09ba7"} Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 
09:18:46.872531 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a311e43b33858bffb71304e30483b281b8af7a5a4005afc37adea7267f09ba7" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.872565 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spzmd" Jan 31 09:18:46 crc kubenswrapper[4783]: I0131 09:18:46.912641 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.360382 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-spzmd"] Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.365426 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-spzmd"] Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.442238 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gksrm"] Jan 31 09:18:47 crc kubenswrapper[4783]: E0131 09:18:47.442812 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beacf9d0-1587-4cb5-b9e9-d284ff8b8288" containerName="keystone-bootstrap" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.442831 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="beacf9d0-1587-4cb5-b9e9-d284ff8b8288" containerName="keystone-bootstrap" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.442979 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="beacf9d0-1587-4cb5-b9e9-d284ff8b8288" containerName="keystone-bootstrap" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.443584 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.446121 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzwfs" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.448154 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gksrm"] Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.448335 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.448363 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.448654 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.450042 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504406 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504463 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504666 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b8kqb\" (UniqueName: \"kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504791 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504833 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.504854 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: E0131 09:18:47.565914 4783 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Jan 31 09:18:47 crc kubenswrapper[4783]: E0131 09:18:47.566110 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dqwz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSou
rce{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-c5vpg_openstack(be3abc10-848a-448b-a2a2-df825e99a23f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:18:47 crc kubenswrapper[4783]: E0131 09:18:47.567656 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-c5vpg" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.612249 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.612377 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.612419 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.612623 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.612732 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.613014 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kqb\" (UniqueName: \"kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.621925 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.622076 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.624637 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle\") pod 
\"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.624961 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.628023 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.629602 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kqb\" (UniqueName: \"kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb\") pod \"keystone-bootstrap-gksrm\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.660796 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beacf9d0-1587-4cb5-b9e9-d284ff8b8288" path="/var/lib/kubelet/pods/beacf9d0-1587-4cb5-b9e9-d284ff8b8288/volumes" Jan 31 09:18:47 crc kubenswrapper[4783]: I0131 09:18:47.761656 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:18:47 crc kubenswrapper[4783]: E0131 09:18:47.882583 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-c5vpg" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.044544 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.064266 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145259 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145350 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config\") pod \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145415 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145490 4783 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qqd\" (UniqueName: \"kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd\") pod \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145546 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle\") pod \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\" (UID: \"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145591 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145613 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145632 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145666 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: 
\"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145716 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.145748 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9nd6\" (UniqueName: \"kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6\") pod \"e52e03f3-fa41-487d-affe-89222406f4bb\" (UID: \"e52e03f3-fa41-487d-affe-89222406f4bb\") " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.146176 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.147843 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs" (OuterVolumeSpecName: "logs") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.150880 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6" (OuterVolumeSpecName: "kube-api-access-k9nd6") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "kube-api-access-k9nd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.153120 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts" (OuterVolumeSpecName: "scripts") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.154443 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd" (OuterVolumeSpecName: "kube-api-access-d6qqd") pod "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" (UID: "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9"). InnerVolumeSpecName "kube-api-access-d6qqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.158084 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.168920 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config" (OuterVolumeSpecName: "config") pod "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" (UID: "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.169199 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.175456 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" (UID: "ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.186690 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data" (OuterVolumeSpecName: "config-data") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.189786 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e52e03f3-fa41-487d-affe-89222406f4bb" (UID: "e52e03f3-fa41-487d-affe-89222406f4bb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248044 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qqd\" (UniqueName: \"kubernetes.io/projected/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-kube-api-access-d6qqd\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248075 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248087 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248097 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248106 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248114 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248122 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e52e03f3-fa41-487d-affe-89222406f4bb-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248566 4783 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-k9nd6\" (UniqueName: \"kubernetes.io/projected/e52e03f3-fa41-487d-affe-89222406f4bb-kube-api-access-k9nd6\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248660 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248680 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.248691 4783 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52e03f3-fa41-487d-affe-89222406f4bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.263054 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.350814 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.915708 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7mds6" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.915680 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mds6" event={"ID":"ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9","Type":"ContainerDied","Data":"ae97364a096f47933f1e641942ae177aef284abbe7e807834b23bdfd4e98be92"} Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.916231 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae97364a096f47933f1e641942ae177aef284abbe7e807834b23bdfd4e98be92" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.922562 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e52e03f3-fa41-487d-affe-89222406f4bb","Type":"ContainerDied","Data":"84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6"} Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.922623 4783 scope.go:117] "RemoveContainer" containerID="429bd9032be043a8d2b56acc8f77c1d8d04ae21916462438903d7e0947a0c027" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.922776 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.952453 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.967253 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.972266 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:49 crc kubenswrapper[4783]: E0131 09:18:49.972792 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-httpd" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.972815 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-httpd" Jan 31 09:18:49 crc kubenswrapper[4783]: E0131 09:18:49.972826 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-log" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.972832 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-log" Jan 31 09:18:49 crc kubenswrapper[4783]: E0131 09:18:49.972869 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" containerName="neutron-db-sync" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.972875 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" containerName="neutron-db-sync" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.973075 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-httpd" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 
09:18:49.973090 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" containerName="glance-log" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.973106 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" containerName="neutron-db-sync" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.974205 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.975853 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.976604 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:49 crc kubenswrapper[4783]: I0131 09:18:49.980552 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.071970 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072097 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072205 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072229 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072328 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072358 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072383 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.072446 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8hp\" (UniqueName: \"kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174588 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8hp\" (UniqueName: \"kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174703 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174746 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174783 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174804 4783 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174843 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174862 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.174885 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.175294 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.176430 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.176445 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.190687 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.199802 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.214315 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8hp\" (UniqueName: \"kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.219688 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.220727 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.253418 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.272953 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.254408 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.292604 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.293338 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.346059 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.347626 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.352874 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.354002 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d6l9f" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.354651 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.356424 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.358228 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.386768 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgp8\" (UniqueName: \"kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.386899 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.386981 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: 
\"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387060 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387129 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387224 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387290 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387358 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kqc\" (UniqueName: \"kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc\") pod 
\"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387449 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387516 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.387620 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489546 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489595 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489621 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489639 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kqc\" (UniqueName: \"kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489669 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489685 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489747 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: 
\"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489779 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgp8\" (UniqueName: \"kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489812 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489836 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.489856 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.490590 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 
crc kubenswrapper[4783]: I0131 09:18:50.491147 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.491699 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.492897 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.493899 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.494802 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.499683 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.507688 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.508044 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kqc\" (UniqueName: \"kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc\") pod \"dnsmasq-dns-6b9c8b59c-56zzg\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.508115 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.510102 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgp8\" (UniqueName: \"kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8\") pod \"neutron-548c6d58db-rh9pc\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.624463 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:18:50 crc kubenswrapper[4783]: I0131 09:18:50.686678 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:18:51 crc kubenswrapper[4783]: I0131 09:18:51.655244 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52e03f3-fa41-487d-affe-89222406f4bb" path="/var/lib/kubelet/pods/e52e03f3-fa41-487d-affe-89222406f4bb/volumes" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.342112 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.344354 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.346594 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.346680 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.351532 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.430765 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.430973 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config\") pod 
\"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.431114 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.431270 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.431393 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.431531 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.533893 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs\") pod 
\"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534433 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534523 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9lm\" (UniqueName: \"kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534557 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534616 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534668 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " 
pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.534704 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.540934 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.540967 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.541356 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.541639 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.542488 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.542625 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.637376 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9lm\" (UniqueName: \"kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.649620 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9lm\" (UniqueName: \"kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm\") pod \"neutron-69d7fc9755-j47fz\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:52 crc kubenswrapper[4783]: I0131 09:18:52.661831 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.221055 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.318623 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvxzp\" (UniqueName: \"kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.318679 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.318764 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.318920 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.319018 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.319072 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.319100 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.319225 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs\") pod \"81c94de6-2854-4f8c-9d50-2467632fa290\" (UID: \"81c94de6-2854-4f8c-9d50-2467632fa290\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.319988 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs" (OuterVolumeSpecName: "logs") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.321966 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.322776 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.322818 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c94de6-2854-4f8c-9d50-2467632fa290-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.325641 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts" (OuterVolumeSpecName: "scripts") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.328031 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.328764 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp" (OuterVolumeSpecName: "kube-api-access-rvxzp") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "kube-api-access-rvxzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.342430 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.357868 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.362089 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data" (OuterVolumeSpecName: "config-data") pod "81c94de6-2854-4f8c-9d50-2467632fa290" (UID: "81c94de6-2854-4f8c-9d50-2467632fa290"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424372 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424416 4783 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424434 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424444 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvxzp\" (UniqueName: \"kubernetes.io/projected/81c94de6-2854-4f8c-9d50-2467632fa290-kube-api-access-rvxzp\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424457 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.424467 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c94de6-2854-4f8c-9d50-2467632fa290-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.441732 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.525793 4783 reconciler_common.go:293] "Volume detached for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: E0131 09:18:56.644628 4783 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 31 09:18:56 crc kubenswrapper[4783]: E0131 09:18:56.645055 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sqcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:n
il,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-k2pw7_openstack(fd0c0937-6461-4221-bbd8-3c7e37bbff9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:18:56 crc kubenswrapper[4783]: E0131 09:18:56.646395 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k2pw7" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.704500 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.731739 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0\") pod \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.731845 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb\") pod \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.731894 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config\") pod 
\"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.731933 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc\") pod \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.732025 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gszsx\" (UniqueName: \"kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx\") pod \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.732040 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb\") pod \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\" (UID: \"8be19e32-ed6c-42b2-9bf7-15bec0bc9696\") " Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.736904 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx" (OuterVolumeSpecName: "kube-api-access-gszsx") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). InnerVolumeSpecName "kube-api-access-gszsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.766381 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.768323 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.772837 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.779496 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.782194 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config" (OuterVolumeSpecName: "config") pod "8be19e32-ed6c-42b2-9bf7-15bec0bc9696" (UID: "8be19e32-ed6c-42b2-9bf7-15bec0bc9696"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835223 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gszsx\" (UniqueName: \"kubernetes.io/projected/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-kube-api-access-gszsx\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835282 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835297 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835309 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835323 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.835375 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be19e32-ed6c-42b2-9bf7-15bec0bc9696-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:56 crc kubenswrapper[4783]: I0131 09:18:56.912435 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Jan 31 09:18:56 crc 
kubenswrapper[4783]: I0131 09:18:56.999729 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" event={"ID":"8be19e32-ed6c-42b2-9bf7-15bec0bc9696","Type":"ContainerDied","Data":"8e5248546510818a27ad43090af2118e9fe46b745ac3e5ab08f497e2f97a43a4"} Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.000314 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-tqvtm" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.004719 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81c94de6-2854-4f8c-9d50-2467632fa290","Type":"ContainerDied","Data":"6fa45e3260a1e34f0d7d7a0c846f185e609c64ea50cecf4f1deeee00343c3bfe"} Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.004782 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.008344 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-k2pw7" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.067821 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.071231 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.077279 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.083637 4783 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-tqvtm"] Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.088930 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.089303 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="init" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089317 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="init" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.089490 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-httpd" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089500 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-httpd" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.089517 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089523 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.089535 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-log" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089540 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-log" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089714 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-log" 
Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089734 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" containerName="glance-httpd" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.089744 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" containerName="dnsmasq-dns" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.090662 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.093443 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.093700 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.108471 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.140075 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqw7\" (UniqueName: \"kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.140295 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.140363 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.140419 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.141120 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.141373 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.141568 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.141617 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.243702 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.243813 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqw7\" (UniqueName: \"kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244007 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244050 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244082 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244125 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244208 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244279 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.244921 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.246185 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.246260 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.250960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.251079 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.251883 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.253913 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.259232 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqw7\" (UniqueName: \"kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.271249 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.410666 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.655817 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c94de6-2854-4f8c-9d50-2467632fa290" path="/var/lib/kubelet/pods/81c94de6-2854-4f8c-9d50-2467632fa290/volumes" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.656560 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be19e32-ed6c-42b2-9bf7-15bec0bc9696" path="/var/lib/kubelet/pods/8be19e32-ed6c-42b2-9bf7-15bec0bc9696/volumes" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.793267 4783 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.793476 4783 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hss89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,Se
curityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6c24v_openstack(6275a243-2cfc-4f77-a4b5-40a697e309d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:18:57 crc kubenswrapper[4783]: E0131 09:18:57.794899 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6c24v" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.810705 4783 scope.go:117] "RemoveContainer" containerID="97b21fd07638dace24707d4370ee45b51df0bec9b662cbd7dcb21f33a68cbe55" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.920603 4783 scope.go:117] "RemoveContainer" containerID="0125f737588e8a5cc8c31b4b7fe7d54d51b5b515ea722d0c944867e5027fefc3" Jan 31 09:18:57 crc kubenswrapper[4783]: I0131 09:18:57.971969 4783 scope.go:117] "RemoveContainer" containerID="fe918a56b93fdc1f4a8d52322f395b62f7e5df453de0c2de89257cfe97572e60" Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.027607 4783 scope.go:117] "RemoveContainer" containerID="330fd53d7adbaaf321a25ebf04431aa0b7a95ce7d496ed22d14f64a5a449bd74" Jan 31 09:18:58 crc kubenswrapper[4783]: E0131 09:18:58.027764 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-6c24v" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.074045 4783 scope.go:117] "RemoveContainer" containerID="033f16f99c5d072377b88f0989d9f1f4d8ab65824d3ca5ea841d80f922630ef0" Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.222567 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6644bf8978-q24zg"] Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.379608 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.465944 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gksrm"] Jan 31 09:18:58 crc kubenswrapper[4783]: W0131 09:18:58.473366 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7684372_8d95_457c_b0a7_a58bb2cbf149.slice/crio-be6df1a540b61707e1f16c5a78fe91cf64c57fd14bf22e8ff7fe728f66919976 WatchSource:0}: Error finding container be6df1a540b61707e1f16c5a78fe91cf64c57fd14bf22e8ff7fe728f66919976: Status 404 returned error can't find the container with id be6df1a540b61707e1f16c5a78fe91cf64c57fd14bf22e8ff7fe728f66919976 Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.474799 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:18:58 crc kubenswrapper[4783]: W0131 09:18:58.479629 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1106ff1_8d0e_4e9f_ba62_d6a443279e0e.slice/crio-7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269 WatchSource:0}: Error finding container 7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269: 
Status 404 returned error can't find the container with id 7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269 Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.495375 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.532915 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.643238 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:18:58 crc kubenswrapper[4783]: I0131 09:18:58.729451 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:18:58 crc kubenswrapper[4783]: W0131 09:18:58.735013 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d6fbb2_49fd_4d85_8247_413da61c4c16.slice/crio-1c9220263ededa3a8891d0c4a04034a3da6fa299dcdbcba6c831dcdb34a47597 WatchSource:0}: Error finding container 1c9220263ededa3a8891d0c4a04034a3da6fa299dcdbcba6c831dcdb34a47597: Status 404 returned error can't find the container with id 1c9220263ededa3a8891d0c4a04034a3da6fa299dcdbcba6c831dcdb34a47597 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.051553 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6644bf8978-q24zg" event={"ID":"940e2e96-d6a1-4576-b83a-e30ff1f6ab85","Type":"ContainerStarted","Data":"9c231f1e2d981e3f398b2e6709894b43852d40d1a2eb349d543720d95c582bbe"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.051774 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6644bf8978-q24zg" event={"ID":"940e2e96-d6a1-4576-b83a-e30ff1f6ab85","Type":"ContainerStarted","Data":"8dab8067b2342985b0712466a6da7987365c3e1bf3b96c313cc7c92ca7ba0ee4"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.051788 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6644bf8978-q24zg" event={"ID":"940e2e96-d6a1-4576-b83a-e30ff1f6ab85","Type":"ContainerStarted","Data":"1319e9808ab728037406ddacc3fe2693f68f9e89517c16edc317edeb2f04f1c2"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.055347 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gksrm" event={"ID":"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e","Type":"ContainerStarted","Data":"03f3988b3e5c6a5472626eea137ca26b65b5e4f8cd826d1c8aa1c55b189b9f82"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.055421 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gksrm" event={"ID":"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e","Type":"ContainerStarted","Data":"7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.058431 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerStarted","Data":"408c4aa1ce013f629cebff83c2e2cac264dcd1ec753f8bffec9f7d71def3f984"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.058467 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerStarted","Data":"c0759064018d6758926b71f9e8597f9a998bd45b9e23a808b5eaa8ff4d6aeef6"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.058481 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cc66f58c7-79z2q" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon-log" containerID="cri-o://c0759064018d6758926b71f9e8597f9a998bd45b9e23a808b5eaa8ff4d6aeef6" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.058523 4783 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-cc66f58c7-79z2q" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon" containerID="cri-o://408c4aa1ce013f629cebff83c2e2cac264dcd1ec753f8bffec9f7d71def3f984" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.064303 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerStarted","Data":"1c9220263ededa3a8891d0c4a04034a3da6fa299dcdbcba6c831dcdb34a47597"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.084084 4783 generic.go:334] "Generic (PLEG): container finished" podID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerID="e461a5b229024f7277c023c14030616b12adaa3ce920bcf93ee57a15d8a92425" exitCode=0 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.084148 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" event={"ID":"f7684372-8d95-457c-b0a7-a58bb2cbf149","Type":"ContainerDied","Data":"e461a5b229024f7277c023c14030616b12adaa3ce920bcf93ee57a15d8a92425"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.084187 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" event={"ID":"f7684372-8d95-457c-b0a7-a58bb2cbf149","Type":"ContainerStarted","Data":"be6df1a540b61707e1f16c5a78fe91cf64c57fd14bf22e8ff7fe728f66919976"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.088136 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerStarted","Data":"fb71f926c1a6ece1f05a32110e077eaf0f05f8ec37d922bbb2a1088d409bd733"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.088215 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cd96d8cbc-jvn7k" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon-log" 
containerID="cri-o://9bdb2e062a7ad48e696ef622530639ab29e48c755d8a1d9c67c9433bfcc380cd" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.088497 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cd96d8cbc-jvn7k" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon" containerID="cri-o://fb71f926c1a6ece1f05a32110e077eaf0f05f8ec37d922bbb2a1088d409bd733" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.090140 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerStarted","Data":"9bdb2e062a7ad48e696ef622530639ab29e48c755d8a1d9c67c9433bfcc380cd"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.098463 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerStarted","Data":"5837f58fda0b78176bc8be8853f671a14f27ac76e9d7dad37189349edab34ccd"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.098490 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerStarted","Data":"f463ce51fb0c33bcc2b49a6a026db8c692b2c493624d349193048fd4dc640e47"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.098611 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5989bc564f-6q4l6" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon-log" containerID="cri-o://f463ce51fb0c33bcc2b49a6a026db8c692b2c493624d349193048fd4dc640e47" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.098697 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5989bc564f-6q4l6" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon" 
containerID="cri-o://5837f58fda0b78176bc8be8853f671a14f27ac76e9d7dad37189349edab34ccd" gracePeriod=30 Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.103363 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerStarted","Data":"4ac707186d7e224d05566301d7bd1db3bc8c84ded3a15f31feb1122387893f31"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.107216 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerStarted","Data":"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.107237 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerStarted","Data":"1bc37110440c86d5cc116d30e5c29f9eb0e976e736f5032c587e80e0cac0b85f"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.108500 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.114177 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6644bf8978-q24zg" podStartSLOduration=17.114151352 podStartE2EDuration="17.114151352s" podCreationTimestamp="2026-01-31 09:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:59.076183158 +0000 UTC m=+849.744866625" watchObservedRunningTime="2026-01-31 09:18:59.114151352 +0000 UTC m=+849.782834820" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.121378 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerStarted","Data":"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.131893 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cc66f58c7-79z2q" podStartSLOduration=2.8994727129999998 podStartE2EDuration="25.131883469s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.615153307 +0000 UTC m=+826.283836775" lastFinishedPulling="2026-01-31 09:18:57.847564064 +0000 UTC m=+848.516247531" observedRunningTime="2026-01-31 09:18:59.090302665 +0000 UTC m=+849.758986133" watchObservedRunningTime="2026-01-31 09:18:59.131883469 +0000 UTC m=+849.800566937" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.133298 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerStarted","Data":"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.133327 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerStarted","Data":"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.133339 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerStarted","Data":"1eb81414a433df5bc0455962382e90f25a71ff960b75ec7400c8715af6aedb23"} Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.135147 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gksrm" podStartSLOduration=12.135141671 podStartE2EDuration="12.135141671s" podCreationTimestamp="2026-01-31 09:18:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:59.109993494 +0000 UTC m=+849.778676961" watchObservedRunningTime="2026-01-31 09:18:59.135141671 +0000 UTC m=+849.803825139" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.140381 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5989bc564f-6q4l6" podStartSLOduration=3.644561038 podStartE2EDuration="25.140373915s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.163209242 +0000 UTC m=+825.831892700" lastFinishedPulling="2026-01-31 09:18:56.659022109 +0000 UTC m=+847.327705577" observedRunningTime="2026-01-31 09:18:59.12178361 +0000 UTC m=+849.790467078" watchObservedRunningTime="2026-01-31 09:18:59.140373915 +0000 UTC m=+849.809057383" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.147431 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cd96d8cbc-jvn7k" podStartSLOduration=3.027133265 podStartE2EDuration="24.147424275s" podCreationTimestamp="2026-01-31 09:18:35 +0000 UTC" firstStartedPulling="2026-01-31 09:18:36.723357151 +0000 UTC m=+827.392040619" lastFinishedPulling="2026-01-31 09:18:57.843648171 +0000 UTC m=+848.512331629" observedRunningTime="2026-01-31 09:18:59.144156385 +0000 UTC m=+849.812839853" watchObservedRunningTime="2026-01-31 09:18:59.147424275 +0000 UTC m=+849.816107743" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.216224 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.256929 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69d7fc9755-j47fz" podStartSLOduration=7.256914308 podStartE2EDuration="7.256914308s" podCreationTimestamp="2026-01-31 09:18:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:59.215791316 +0000 UTC m=+849.884474784" watchObservedRunningTime="2026-01-31 09:18:59.256914308 +0000 UTC m=+849.925597776" Jan 31 09:18:59 crc kubenswrapper[4783]: I0131 09:18:59.266562 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f5ff596f4-ffmss" podStartSLOduration=17.266553808 podStartE2EDuration="17.266553808s" podCreationTimestamp="2026-01-31 09:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:59.241663818 +0000 UTC m=+849.910347286" watchObservedRunningTime="2026-01-31 09:18:59.266553808 +0000 UTC m=+849.935237277" Jan 31 09:18:59 crc kubenswrapper[4783]: E0131 09:18:59.883119 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7e6f7a_4b59_42fd_9ef2_4f761e2d0af9.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.156208 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerStarted","Data":"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.161744 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerStarted","Data":"7b93ba43db8cca037d4ec83f5e78efbed6d4e07779129bd77b31e7f73945096c"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.165842 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" event={"ID":"f7684372-8d95-457c-b0a7-a58bb2cbf149","Type":"ContainerStarted","Data":"184b1f0e049c40968bb5d4e9e2e7f3bf901d04dfae41d532236bdd87f80082af"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.167509 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.171749 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerStarted","Data":"20aefd71d61b3c915fbf37bbf3e29db31841607652a4d8ed8d8fad64aa04c572"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.171786 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerStarted","Data":"4970428ea9d39e24a6102c5ea689c00f52b2dc12fcf95f98404c180c4af9d184"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.174369 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerStarted","Data":"c62b41e75fbb1a433cfb1c2dff030c8b66835350f6bab80fb2356f17562c3834"} Jan 31 09:19:00 crc kubenswrapper[4783]: I0131 09:19:00.192587 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" podStartSLOduration=10.192574839 podStartE2EDuration="10.192574839s" podCreationTimestamp="2026-01-31 09:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:19:00.185801692 +0000 UTC m=+850.854485160" watchObservedRunningTime="2026-01-31 09:19:00.192574839 +0000 UTC m=+850.861258307" Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.182997 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerStarted","Data":"467bb4b3b850268e9cc8f923dd2680181e0341456bb9481b712e6537c005e9b3"} Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.185006 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerStarted","Data":"eeeb2276e4b6fb5bc70d48279133cb29782cf70016c2a9c62c5c25a1d4f94cd2"} Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.185118 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.187052 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerStarted","Data":"5b01fff520a9d67f2f3f01f9adc270009f597046e676c124e99d31cbbc8438f3"} Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.188829 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerStarted","Data":"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a"} Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.260930 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.260897336 podStartE2EDuration="4.260897336s" podCreationTimestamp="2026-01-31 09:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:19:01.215657253 +0000 UTC m=+851.884340710" watchObservedRunningTime="2026-01-31 09:19:01.260897336 +0000 UTC m=+851.929580804" Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.266957 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-548c6d58db-rh9pc" podStartSLOduration=11.266947561 podStartE2EDuration="11.266947561s" podCreationTimestamp="2026-01-31 09:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:01.241758927 +0000 UTC m=+851.910442395" watchObservedRunningTime="2026-01-31 09:19:01.266947561 +0000 UTC m=+851.935631029" Jan 31 09:19:01 crc kubenswrapper[4783]: I0131 09:19:01.270854 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.270845228 podStartE2EDuration="12.270845228s" podCreationTimestamp="2026-01-31 09:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:01.260246859 +0000 UTC m=+851.928930327" watchObservedRunningTime="2026-01-31 09:19:01.270845228 +0000 UTC m=+851.939528696" Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.197843 4783 generic.go:334] "Generic (PLEG): container finished" podID="e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" containerID="03f3988b3e5c6a5472626eea137ca26b65b5e4f8cd826d1c8aa1c55b189b9f82" exitCode=0 Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.198576 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gksrm" event={"ID":"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e","Type":"ContainerDied","Data":"03f3988b3e5c6a5472626eea137ca26b65b5e4f8cd826d1c8aa1c55b189b9f82"} Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.775789 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.776096 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.868728 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:19:02 crc kubenswrapper[4783]: I0131 09:19:02.868834 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:19:03 crc kubenswrapper[4783]: I0131 09:19:03.210062 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c5vpg" event={"ID":"be3abc10-848a-448b-a2a2-df825e99a23f","Type":"ContainerStarted","Data":"1a4cb936a7541f68ef79f19c0624411d54eb926fab985b9167b8322f644300b8"} Jan 31 09:19:03 crc kubenswrapper[4783]: I0131 09:19:03.232086 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-c5vpg" podStartSLOduration=2.359363351 podStartE2EDuration="29.23206822s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.387624174 +0000 UTC m=+826.056307643" lastFinishedPulling="2026-01-31 09:19:02.260329044 +0000 UTC m=+852.929012512" observedRunningTime="2026-01-31 09:19:03.229447348 +0000 UTC m=+853.898130816" watchObservedRunningTime="2026-01-31 09:19:03.23206822 +0000 UTC m=+853.900751687" Jan 31 09:19:04 crc kubenswrapper[4783]: I0131 09:19:04.217987 4783 generic.go:334] "Generic (PLEG): container finished" podID="be3abc10-848a-448b-a2a2-df825e99a23f" containerID="1a4cb936a7541f68ef79f19c0624411d54eb926fab985b9167b8322f644300b8" exitCode=0 Jan 31 09:19:04 crc kubenswrapper[4783]: I0131 09:19:04.218079 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c5vpg" 
event={"ID":"be3abc10-848a-448b-a2a2-df825e99a23f","Type":"ContainerDied","Data":"1a4cb936a7541f68ef79f19c0624411d54eb926fab985b9167b8322f644300b8"} Jan 31 09:19:04 crc kubenswrapper[4783]: I0131 09:19:04.534907 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:19:04 crc kubenswrapper[4783]: I0131 09:19:04.764674 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:19:05 crc kubenswrapper[4783]: I0131 09:19:05.626217 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:19:05 crc kubenswrapper[4783]: I0131 09:19:05.701800 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:19:05 crc kubenswrapper[4783]: I0131 09:19:05.702063 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="dnsmasq-dns" containerID="cri-o://b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae" gracePeriod=10 Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.194259 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.749617 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.761797 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-c5vpg" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848697 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts\") pod \"be3abc10-848a-448b-a2a2-df825e99a23f\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848743 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8kqb\" (UniqueName: \"kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848817 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848861 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs\") pod \"be3abc10-848a-448b-a2a2-df825e99a23f\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848897 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848948 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwz5\" (UniqueName: 
\"kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5\") pod \"be3abc10-848a-448b-a2a2-df825e99a23f\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.848993 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.849032 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle\") pod \"be3abc10-848a-448b-a2a2-df825e99a23f\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.849858 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.849913 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys\") pod \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\" (UID: \"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.850031 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data\") pod \"be3abc10-848a-448b-a2a2-df825e99a23f\" (UID: \"be3abc10-848a-448b-a2a2-df825e99a23f\") " Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.855472 4783 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs" (OuterVolumeSpecName: "logs") pod "be3abc10-848a-448b-a2a2-df825e99a23f" (UID: "be3abc10-848a-448b-a2a2-df825e99a23f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.871543 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.871623 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts" (OuterVolumeSpecName: "scripts") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.873347 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.873375 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5" (OuterVolumeSpecName: "kube-api-access-dqwz5") pod "be3abc10-848a-448b-a2a2-df825e99a23f" (UID: "be3abc10-848a-448b-a2a2-df825e99a23f"). 
InnerVolumeSpecName "kube-api-access-dqwz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.873436 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb" (OuterVolumeSpecName: "kube-api-access-b8kqb") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "kube-api-access-b8kqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.876288 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts" (OuterVolumeSpecName: "scripts") pod "be3abc10-848a-448b-a2a2-df825e99a23f" (UID: "be3abc10-848a-448b-a2a2-df825e99a23f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.880356 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data" (OuterVolumeSpecName: "config-data") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.884350 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data" (OuterVolumeSpecName: "config-data") pod "be3abc10-848a-448b-a2a2-df825e99a23f" (UID: "be3abc10-848a-448b-a2a2-df825e99a23f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.901699 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be3abc10-848a-448b-a2a2-df825e99a23f" (UID: "be3abc10-848a-448b-a2a2-df825e99a23f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.903906 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" (UID: "e1106ff1-8d0e-4e9f-ba62-d6a443279e0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.938646 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952530 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwz5\" (UniqueName: \"kubernetes.io/projected/be3abc10-848a-448b-a2a2-df825e99a23f-kube-api-access-dqwz5\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952556 4783 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952569 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952579 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952587 4783 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952596 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952603 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be3abc10-848a-448b-a2a2-df825e99a23f-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952614 4783 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8kqb\" (UniqueName: \"kubernetes.io/projected/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-kube-api-access-b8kqb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952622 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952630 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3abc10-848a-448b-a2a2-df825e99a23f-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:06 crc kubenswrapper[4783]: I0131 09:19:06.952637 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.053890 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.054289 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.054568 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: 
\"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.054597 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.054624 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxznq\" (UniqueName: \"kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.054656 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb\") pod \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\" (UID: \"e28f3ed6-b41b-46f2-822a-0a7a1c79e512\") " Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.084560 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq" (OuterVolumeSpecName: "kube-api-access-sxznq") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "kube-api-access-sxznq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.105754 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config" (OuterVolumeSpecName: "config") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.110232 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.110548 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.114996 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.115517 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e28f3ed6-b41b-46f2-822a-0a7a1c79e512" (UID: "e28f3ed6-b41b-46f2-822a-0a7a1c79e512"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157796 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157832 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157847 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxznq\" (UniqueName: \"kubernetes.io/projected/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-kube-api-access-sxznq\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157859 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157869 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.157878 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28f3ed6-b41b-46f2-822a-0a7a1c79e512-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.243907 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gksrm" event={"ID":"e1106ff1-8d0e-4e9f-ba62-d6a443279e0e","Type":"ContainerDied","Data":"7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269"} Jan 31 09:19:07 crc 
kubenswrapper[4783]: I0131 09:19:07.244004 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7709077033b7c821ebcb6003b18f1573407de5069371e3cdfbcd1f089c710269" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.244126 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gksrm" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.253432 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c5vpg" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.253459 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c5vpg" event={"ID":"be3abc10-848a-448b-a2a2-df825e99a23f","Type":"ContainerDied","Data":"492f2e8a4078eb9777859fe393372cb93070e5f17568cf3d7d3918b8ae88c4a9"} Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.253633 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="492f2e8a4078eb9777859fe393372cb93070e5f17568cf3d7d3918b8ae88c4a9" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.260052 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerStarted","Data":"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2"} Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.266128 4783 generic.go:334] "Generic (PLEG): container finished" podID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerID="b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae" exitCode=0 Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.266254 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.266389 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" event={"ID":"e28f3ed6-b41b-46f2-822a-0a7a1c79e512","Type":"ContainerDied","Data":"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae"} Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.266495 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-tlndp" event={"ID":"e28f3ed6-b41b-46f2-822a-0a7a1c79e512","Type":"ContainerDied","Data":"15e54d73de0e6da76d7852766a4374ead716adb651411288c63c4d30fb7ef116"} Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.266560 4783 scope.go:117] "RemoveContainer" containerID="b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.292688 4783 scope.go:117] "RemoveContainer" containerID="6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.306986 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.314373 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-tlndp"] Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.327466 4783 scope.go:117] "RemoveContainer" containerID="b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae" Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.327958 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae\": container with ID starting with b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae not found: ID does not exist" 
containerID="b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.328005 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae"} err="failed to get container status \"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae\": rpc error: code = NotFound desc = could not find container \"b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae\": container with ID starting with b131629e024206f4c4bfc74fbc7f6105d6719304dd28176ca7b139ca386afaae not found: ID does not exist" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.328039 4783 scope.go:117] "RemoveContainer" containerID="6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5" Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.328488 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5\": container with ID starting with 6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5 not found: ID does not exist" containerID="6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.328525 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5"} err="failed to get container status \"6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5\": rpc error: code = NotFound desc = could not find container \"6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5\": container with ID starting with 6774532024d3b31efd144b298c3951c8857ccbe48ce6563e6c761a4dcb09b1c5 not found: ID does not exist" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.411430 4783 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.411492 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.453767 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.496608 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.668064 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" path="/var/lib/kubelet/pods/e28f3ed6-b41b-46f2-822a-0a7a1c79e512/volumes" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.858585 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-688b79757c-l8xjk"] Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.858992 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" containerName="placement-db-sync" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859008 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" containerName="placement-db-sync" Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.859027 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="init" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859033 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="init" Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.859060 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="dnsmasq-dns" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859066 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="dnsmasq-dns" Jan 31 09:19:07 crc kubenswrapper[4783]: E0131 09:19:07.859084 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" containerName="keystone-bootstrap" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859089 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" containerName="keystone-bootstrap" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859282 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" containerName="keystone-bootstrap" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859304 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28f3ed6-b41b-46f2-822a-0a7a1c79e512" containerName="dnsmasq-dns" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859319 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" containerName="placement-db-sync" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.859953 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.863019 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.863382 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.863558 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.868539 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzwfs" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.868597 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.868751 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.874059 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-688b79757c-l8xjk"] Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.947945 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.949506 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.951293 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.951577 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.954451 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.954607 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.954728 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lj2nd" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.970046 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986499 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-combined-ca-bundle\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986555 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-config-data\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986616 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-scripts\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986637 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-internal-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986660 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-credential-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986698 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njktn\" (UniqueName: \"kubernetes.io/projected/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-kube-api-access-njktn\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986720 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-public-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:07 crc kubenswrapper[4783]: I0131 09:19:07.986739 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-fernet-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088411 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088498 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-combined-ca-bundle\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088550 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-config-data\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088606 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088642 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v8k\" (UniqueName: \"kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088679 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088759 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-scripts\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088793 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-internal-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088828 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-credential-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088857 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.088925 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njktn\" (UniqueName: \"kubernetes.io/projected/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-kube-api-access-njktn\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.089020 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-public-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.089079 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-fernet-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.089115 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.089153 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.094743 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-fernet-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.095052 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-credential-keys\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.095409 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-scripts\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.095866 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-public-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.096946 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-config-data\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.105384 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-combined-ca-bundle\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.110390 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njktn\" (UniqueName: \"kubernetes.io/projected/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-kube-api-access-njktn\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.111645 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae-internal-tls-certs\") pod \"keystone-688b79757c-l8xjk\" (UID: \"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae\") " pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.163724 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5df45c7c98-nt6z5"] Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.165395 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.177017 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191031 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191073 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v8k\" (UniqueName: \"kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191098 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191142 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191204 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" 
Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191229 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191254 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.191613 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.192796 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5df45c7c98-nt6z5"] Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.195242 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.197629 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" 
Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.201146 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.203103 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.217670 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.223895 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v8k\" (UniqueName: \"kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k\") pod \"placement-67b45945f8-49g2r\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.271835 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.279765 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.279809 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296335 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-scripts\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296549 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6cc\" (UniqueName: \"kubernetes.io/projected/05962991-82e5-4c31-87fa-c7df3cba5f90-kube-api-access-zk6cc\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296572 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05962991-82e5-4c31-87fa-c7df3cba5f90-logs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296625 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-config-data\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " 
pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296640 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-public-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296694 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-combined-ca-bundle\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.296722 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-internal-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401125 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-combined-ca-bundle\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401258 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-internal-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: 
\"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401337 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-scripts\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401408 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6cc\" (UniqueName: \"kubernetes.io/projected/05962991-82e5-4c31-87fa-c7df3cba5f90-kube-api-access-zk6cc\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401453 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05962991-82e5-4c31-87fa-c7df3cba5f90-logs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401593 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-public-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.401618 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-config-data\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 
crc kubenswrapper[4783]: I0131 09:19:08.404577 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05962991-82e5-4c31-87fa-c7df3cba5f90-logs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.414571 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-public-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.416397 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-scripts\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.420837 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-combined-ca-bundle\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.434033 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-internal-tls-certs\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.436772 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zk6cc\" (UniqueName: \"kubernetes.io/projected/05962991-82e5-4c31-87fa-c7df3cba5f90-kube-api-access-zk6cc\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.439527 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05962991-82e5-4c31-87fa-c7df3cba5f90-config-data\") pod \"placement-5df45c7c98-nt6z5\" (UID: \"05962991-82e5-4c31-87fa-c7df3cba5f90\") " pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.565496 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.724745 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-688b79757c-l8xjk"] Jan 31 09:19:08 crc kubenswrapper[4783]: I0131 09:19:08.824642 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.251854 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5df45c7c98-nt6z5"] Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.282201 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.284045 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.285357 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.348061 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-688b79757c-l8xjk" event={"ID":"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae","Type":"ContainerStarted","Data":"95381446b8c2d8ccedbae60457336fa40381a5150a97a61f771ce726a6a05c3f"} Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.348224 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-688b79757c-l8xjk" event={"ID":"5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae","Type":"ContainerStarted","Data":"b3ad7e409a55be5cdd6c598fdde521798a9b8f6df6162d186dae619e7a08df39"} Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.349224 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.364392 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerStarted","Data":"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1"} Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.364447 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerStarted","Data":"737f3858aa52343ae55b746f16f2d67a6b9d6780e21521c93d87bb5310a3bb43"} Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.373713 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-688b79757c-l8xjk" podStartSLOduration=2.3737040990000002 podStartE2EDuration="2.373704099s" podCreationTimestamp="2026-01-31 09:19:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:09.368326923 +0000 UTC m=+860.037010381" watchObservedRunningTime="2026-01-31 09:19:09.373704099 +0000 UTC m=+860.042387567" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.380117 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5df45c7c98-nt6z5" event={"ID":"05962991-82e5-4c31-87fa-c7df3cba5f90","Type":"ContainerStarted","Data":"f73d5976e4c6a4474bff0da5dcf44abc765ae3d6eab81a2cd92d15f72efbf4cc"} Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.458392 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697wt\" (UniqueName: \"kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.458493 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.458517 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.560959 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697wt\" 
(UniqueName: \"kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.561274 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.561359 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.563642 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.564413 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.586565 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697wt\" (UniqueName: 
\"kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt\") pod \"redhat-marketplace-vxv6q\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:09 crc kubenswrapper[4783]: I0131 09:19:09.616125 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:10 crc kubenswrapper[4783]: E0131 09:19:10.165070 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7e6f7a_4b59_42fd_9ef2_4f761e2d0af9.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.232538 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.293466 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.293515 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.338696 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.350966 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.406043 4783 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerStarted","Data":"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb"} Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.406314 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.406353 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.411679 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5df45c7c98-nt6z5" event={"ID":"05962991-82e5-4c31-87fa-c7df3cba5f90","Type":"ContainerStarted","Data":"522b6800ac273d84025954ebbd43882973bddc62a4601552370c1681701de2f6"} Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.411722 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5df45c7c98-nt6z5" event={"ID":"05962991-82e5-4c31-87fa-c7df3cba5f90","Type":"ContainerStarted","Data":"5478118f03092c1c5be54e4694168068ff29665c0b80e71d9f01d3a94b79e074"} Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.412253 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.412365 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.416041 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerStarted","Data":"a6a89c35eb100b12139c8a66a94e0e405fc063ba66570d656b0a34a375689680"} Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.416966 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.417611 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.442507 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67b45945f8-49g2r" podStartSLOduration=3.442488867 podStartE2EDuration="3.442488867s" podCreationTimestamp="2026-01-31 09:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:10.425788905 +0000 UTC m=+861.094472374" watchObservedRunningTime="2026-01-31 09:19:10.442488867 +0000 UTC m=+861.111172335" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.500271 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.500383 4783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.523676 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5df45c7c98-nt6z5" podStartSLOduration=2.523657038 podStartE2EDuration="2.523657038s" podCreationTimestamp="2026-01-31 09:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:10.462816787 +0000 UTC m=+861.131500265" watchObservedRunningTime="2026-01-31 09:19:10.523657038 +0000 UTC m=+861.192340507" Jan 31 09:19:10 crc kubenswrapper[4783]: I0131 09:19:10.922717 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:11 crc kubenswrapper[4783]: I0131 09:19:11.433318 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerID="d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3" exitCode=0 Jan 31 09:19:11 crc kubenswrapper[4783]: I0131 09:19:11.433546 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerDied","Data":"d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3"} Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.219043 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.222741 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.466534 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerStarted","Data":"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0"} Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.492402 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6c24v" event={"ID":"6275a243-2cfc-4f77-a4b5-40a697e309d9","Type":"ContainerStarted","Data":"4a25d7ae9d70db3cf6369f78fc4d5132e8bdc523ca1308ad2b307efc5fa7e3f6"} Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.778224 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 31 09:19:12 crc kubenswrapper[4783]: I0131 09:19:12.870759 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6644bf8978-q24zg" 
podUID="940e2e96-d6a1-4576-b83a-e30ff1f6ab85" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 31 09:19:13 crc kubenswrapper[4783]: I0131 09:19:13.503721 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2pw7" event={"ID":"fd0c0937-6461-4221-bbd8-3c7e37bbff9d","Type":"ContainerStarted","Data":"4afa1e0517eca9bcff31824e306355561432ec3e2e411eb39b30ad844d68d49e"} Jan 31 09:19:13 crc kubenswrapper[4783]: I0131 09:19:13.509402 4783 generic.go:334] "Generic (PLEG): container finished" podID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerID="656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0" exitCode=0 Jan 31 09:19:13 crc kubenswrapper[4783]: I0131 09:19:13.509911 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerDied","Data":"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0"} Jan 31 09:19:13 crc kubenswrapper[4783]: I0131 09:19:13.524445 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6c24v" podStartSLOduration=3.956182342 podStartE2EDuration="39.524413153s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.621749758 +0000 UTC m=+826.290433225" lastFinishedPulling="2026-01-31 09:19:11.189980578 +0000 UTC m=+861.858664036" observedRunningTime="2026-01-31 09:19:12.51947279 +0000 UTC m=+863.188156258" watchObservedRunningTime="2026-01-31 09:19:13.524413153 +0000 UTC m=+864.193096622" Jan 31 09:19:13 crc kubenswrapper[4783]: I0131 09:19:13.536514 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k2pw7" podStartSLOduration=2.932041233 podStartE2EDuration="39.536497133s" 
podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.803838075 +0000 UTC m=+826.472521543" lastFinishedPulling="2026-01-31 09:19:12.408293975 +0000 UTC m=+863.076977443" observedRunningTime="2026-01-31 09:19:13.514941669 +0000 UTC m=+864.183625137" watchObservedRunningTime="2026-01-31 09:19:13.536497133 +0000 UTC m=+864.205180601" Jan 31 09:19:14 crc kubenswrapper[4783]: I0131 09:19:14.523446 4783 generic.go:334] "Generic (PLEG): container finished" podID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" containerID="4afa1e0517eca9bcff31824e306355561432ec3e2e411eb39b30ad844d68d49e" exitCode=0 Jan 31 09:19:14 crc kubenswrapper[4783]: I0131 09:19:14.524056 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2pw7" event={"ID":"fd0c0937-6461-4221-bbd8-3c7e37bbff9d","Type":"ContainerDied","Data":"4afa1e0517eca9bcff31824e306355561432ec3e2e411eb39b30ad844d68d49e"} Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.537526 4783 generic.go:334] "Generic (PLEG): container finished" podID="6275a243-2cfc-4f77-a4b5-40a697e309d9" containerID="4a25d7ae9d70db3cf6369f78fc4d5132e8bdc523ca1308ad2b307efc5fa7e3f6" exitCode=0 Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.537816 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6c24v" event={"ID":"6275a243-2cfc-4f77-a4b5-40a697e309d9","Type":"ContainerDied","Data":"4a25d7ae9d70db3cf6369f78fc4d5132e8bdc523ca1308ad2b307efc5fa7e3f6"} Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.666828 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.668552 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.668640 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.749117 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdk6z\" (UniqueName: \"kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.749445 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.749530 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.851265 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdk6z\" (UniqueName: \"kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.851307 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities\") pod 
\"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.851389 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.852079 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.852110 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.872014 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdk6z\" (UniqueName: \"kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z\") pod \"certified-operators-7k8d2\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:15 crc kubenswrapper[4783]: I0131 09:19:15.996830 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.170774 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.361707 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle\") pod \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.361760 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqcn\" (UniqueName: \"kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn\") pod \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.362095 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data\") pod \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\" (UID: \"fd0c0937-6461-4221-bbd8-3c7e37bbff9d\") " Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.368570 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn" (OuterVolumeSpecName: "kube-api-access-9sqcn") pod "fd0c0937-6461-4221-bbd8-3c7e37bbff9d" (UID: "fd0c0937-6461-4221-bbd8-3c7e37bbff9d"). InnerVolumeSpecName "kube-api-access-9sqcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.368727 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fd0c0937-6461-4221-bbd8-3c7e37bbff9d" (UID: "fd0c0937-6461-4221-bbd8-3c7e37bbff9d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.426799 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd0c0937-6461-4221-bbd8-3c7e37bbff9d" (UID: "fd0c0937-6461-4221-bbd8-3c7e37bbff9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.465198 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.465224 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sqcn\" (UniqueName: \"kubernetes.io/projected/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-kube-api-access-9sqcn\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.465236 4783 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd0c0937-6461-4221-bbd8-3c7e37bbff9d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.556930 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k2pw7" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.556926 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k2pw7" event={"ID":"fd0c0937-6461-4221-bbd8-3c7e37bbff9d","Type":"ContainerDied","Data":"b1bea4ce8f9c74d70b70c2cf7edb881af63d164602649430e59eb6637119de81"} Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.557572 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bea4ce8f9c74d70b70c2cf7edb881af63d164602649430e59eb6637119de81" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.761812 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c946647fc-lsk5p"] Jan 31 09:19:16 crc kubenswrapper[4783]: E0131 09:19:16.765954 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" containerName="barbican-db-sync" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.765972 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" containerName="barbican-db-sync" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.766177 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" containerName="barbican-db-sync" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.767018 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.774616 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c946647fc-lsk5p"] Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.774641 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqzmk" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.774686 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.774722 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.840210 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5cf88c75b6-glzzx"] Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.841684 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.852548 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.879821 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5cf88c75b6-glzzx"] Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.883253 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-combined-ca-bundle\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.883301 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-combined-ca-bundle\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886239 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data-custom\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886302 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886331 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqgd\" (UniqueName: \"kubernetes.io/projected/814c0372-f441-4fce-b7d3-47827597fdd5-kube-api-access-6jqgd\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886364 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data-custom\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886390 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814c0372-f441-4fce-b7d3-47827597fdd5-logs\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886408 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clzdv\" (UniqueName: \"kubernetes.io/projected/3cd5ec39-81e0-44cd-b99f-01e3d301b192-kube-api-access-clzdv\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886425 
4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.886459 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd5ec39-81e0-44cd-b99f-01e3d301b192-logs\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.952219 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.955384 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.989815 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-combined-ca-bundle\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.989862 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-combined-ca-bundle\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.989939 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data-custom\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.989977 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.989998 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqgd\" (UniqueName: 
\"kubernetes.io/projected/814c0372-f441-4fce-b7d3-47827597fdd5-kube-api-access-6jqgd\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990021 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data-custom\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990040 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990058 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814c0372-f441-4fce-b7d3-47827597fdd5-logs\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990073 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clzdv\" (UniqueName: \"kubernetes.io/projected/3cd5ec39-81e0-44cd-b99f-01e3d301b192-kube-api-access-clzdv\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990096 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3cd5ec39-81e0-44cd-b99f-01e3d301b192-logs\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:16 crc kubenswrapper[4783]: I0131 09:19:16.990640 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd5ec39-81e0-44cd-b99f-01e3d301b192-logs\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.001151 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.001235 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814c0372-f441-4fce-b7d3-47827597fdd5-logs\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.005055 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-combined-ca-bundle\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.009079 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-config-data-custom\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.014628 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814c0372-f441-4fce-b7d3-47827597fdd5-combined-ca-bundle\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.028958 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data-custom\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.041887 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd5ec39-81e0-44cd-b99f-01e3d301b192-config-data\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.042320 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clzdv\" (UniqueName: \"kubernetes.io/projected/3cd5ec39-81e0-44cd-b99f-01e3d301b192-kube-api-access-clzdv\") pod \"barbican-keystone-listener-5cf88c75b6-glzzx\" (UID: \"3cd5ec39-81e0-44cd-b99f-01e3d301b192\") " pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.045948 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6jqgd\" (UniqueName: \"kubernetes.io/projected/814c0372-f441-4fce-b7d3-47827597fdd5-kube-api-access-6jqgd\") pod \"barbican-worker-7c946647fc-lsk5p\" (UID: \"814c0372-f441-4fce-b7d3-47827597fdd5\") " pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.046017 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.072267 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"] Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.074877 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.077582 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.094670 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"] Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100474 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100560 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lj8n\" (UniqueName: \"kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100682 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100825 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100893 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.100922 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.140476 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c946647fc-lsk5p" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203068 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203143 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203187 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtt49\" (UniqueName: \"kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203235 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203262 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs\") pod \"barbican-api-676f7f866-qr2ck\" (UID: 
\"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203377 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203452 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203478 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203534 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203581 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " 
pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.203623 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lj8n\" (UniqueName: \"kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.204713 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.205149 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.205257 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.207869 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc 
kubenswrapper[4783]: I0131 09:19:17.208191 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.221925 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lj8n\" (UniqueName: \"kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n\") pod \"dnsmasq-dns-7bdf86f46f-7h727\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.226520 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.291301 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.304919 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.304957 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtt49\" (UniqueName: \"kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.304991 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.305010 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.305079 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " 
pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.305908 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.308719 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.309283 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.318275 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 09:19:17.320583 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtt49\" (UniqueName: \"kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49\") pod \"barbican-api-676f7f866-qr2ck\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") " pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:17 crc kubenswrapper[4783]: I0131 
09:19:17.442546 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.214455 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6c24v" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.325966 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.326043 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.326064 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hss89\" (UniqueName: \"kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.326194 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.326226 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.326657 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts\") pod \"6275a243-2cfc-4f77-a4b5-40a697e309d9\" (UID: \"6275a243-2cfc-4f77-a4b5-40a697e309d9\") " Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.327770 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.348311 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.348583 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts" (OuterVolumeSpecName: "scripts") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.350095 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89" (OuterVolumeSpecName: "kube-api-access-hss89") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "kube-api-access-hss89". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.382482 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.427813 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data" (OuterVolumeSpecName: "config-data") pod "6275a243-2cfc-4f77-a4b5-40a697e309d9" (UID: "6275a243-2cfc-4f77-a4b5-40a697e309d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429061 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429075 4783 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429084 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429092 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hss89\" (UniqueName: \"kubernetes.io/projected/6275a243-2cfc-4f77-a4b5-40a697e309d9-kube-api-access-hss89\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429099 4783 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6275a243-2cfc-4f77-a4b5-40a697e309d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.429106 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6275a243-2cfc-4f77-a4b5-40a697e309d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.585912 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6c24v" event={"ID":"6275a243-2cfc-4f77-a4b5-40a697e309d9","Type":"ContainerDied","Data":"b33d7436ddd08c683623148f9b35ba95a8238cd60e440a039df8f85844891792"} Jan 31 09:19:18 crc 
kubenswrapper[4783]: I0131 09:19:18.585951 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33d7436ddd08c683623148f9b35ba95a8238cd60e440a039df8f85844891792" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.586013 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6c24v" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.594097 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerStarted","Data":"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02"} Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.594271 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-central-agent" containerID="cri-o://3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6" gracePeriod=30 Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.594372 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.594920 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="proxy-httpd" containerID="cri-o://013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02" gracePeriod=30 Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.595072 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="sg-core" containerID="cri-o://9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2" gracePeriod=30 Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.595346 4783 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/ceilometer-0" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-notification-agent" containerID="cri-o://adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a" gracePeriod=30 Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.610542 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerStarted","Data":"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2"} Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.622964 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.254760866 podStartE2EDuration="44.622949817s" podCreationTimestamp="2026-01-31 09:18:34 +0000 UTC" firstStartedPulling="2026-01-31 09:18:35.809937459 +0000 UTC m=+826.478620928" lastFinishedPulling="2026-01-31 09:19:18.178126411 +0000 UTC m=+868.846809879" observedRunningTime="2026-01-31 09:19:18.619451011 +0000 UTC m=+869.288134479" watchObservedRunningTime="2026-01-31 09:19:18.622949817 +0000 UTC m=+869.291633285" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.637143 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vxv6q" podStartSLOduration=2.849874277 podStartE2EDuration="9.637130869s" podCreationTimestamp="2026-01-31 09:19:09 +0000 UTC" firstStartedPulling="2026-01-31 09:19:11.441877705 +0000 UTC m=+862.110561173" lastFinishedPulling="2026-01-31 09:19:18.229134297 +0000 UTC m=+868.897817765" observedRunningTime="2026-01-31 09:19:18.636408718 +0000 UTC m=+869.305092176" watchObservedRunningTime="2026-01-31 09:19:18.637130869 +0000 UTC m=+869.305814338" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.705428 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"] Jan 31 09:19:18 crc 
kubenswrapper[4783]: I0131 09:19:18.720636 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c946647fc-lsk5p"] Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.726929 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.857565 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:18 crc kubenswrapper[4783]: E0131 09:19:18.857945 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" containerName="cinder-db-sync" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.857962 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" containerName="cinder-db-sync" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.858140 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" containerName="cinder-db-sync" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.859350 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.894199 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5cf88c75b6-glzzx"] Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.904215 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.944555 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47rv\" (UniqueName: \"kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.944672 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.944743 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:18 crc kubenswrapper[4783]: I0131 09:19:18.945090 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.048825 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47rv\" 
(UniqueName: \"kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.049266 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.049343 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.049933 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.050550 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.071392 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47rv\" (UniqueName: 
\"kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv\") pod \"redhat-operators-dmfcr\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.134530 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.454960 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.461116 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.464087 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.468601 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.468968 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.468978 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gb2rj" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.492764 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.562312 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 
09:19:19.562372 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.562427 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.562483 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.562514 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcz6\" (UniqueName: \"kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.562567 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.564895 4783 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.603474 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.605312 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.619224 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.619285 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.619442 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.625580 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.663865 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666215 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666253 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666279 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666300 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666323 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666342 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk8km\" (UniqueName: \"kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666368 4783 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ptcz6\" (UniqueName: \"kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666398 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666449 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.666785 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.668147 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.668227 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.673978 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.691712 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.691757 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.692616 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" event={"ID":"3cd5ec39-81e0-44cd-b99f-01e3d301b192","Type":"ContainerStarted","Data":"ca2fd5357ff120c80ea8f13743c651878f08489433858a2795da208ff4c10af2"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.692910 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 
09:19:19.697361 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerStarted","Data":"0e192ff1a78289e7a4877820e9e898abbbee52d93fb499e3ba8df5ce67fc2553"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.697397 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerStarted","Data":"69cba238106285873331f2043c781dbc4270d7edf1f10b318ab5d42268a7cf29"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.697408 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerStarted","Data":"09be4ca1425941bb718a979077c9128c6d7473125a357da85490821cc225b829"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.697725 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.697986 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.699583 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcz6\" (UniqueName: \"kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6\") pod \"cinder-scheduler-0\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") " pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.702465 4783 generic.go:334] "Generic (PLEG): container finished" podID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" containerID="d0ea84461d9f22e06c40321cd19c7080bbb0e3da1db5e1fbe90c4a2b80f3b2f5" exitCode=0 Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.702554 4783 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" event={"ID":"7f688143-1580-4aaa-8cbe-9368c7f6c0b2","Type":"ContainerDied","Data":"d0ea84461d9f22e06c40321cd19c7080bbb0e3da1db5e1fbe90c4a2b80f3b2f5"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.702585 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" event={"ID":"7f688143-1580-4aaa-8cbe-9368c7f6c0b2","Type":"ContainerStarted","Data":"ed7a2cfc1131d57c5fe0f209f6b974cd48101b5686a6cb98e7949477ea5b771c"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.711731 4783 generic.go:334] "Generic (PLEG): container finished" podID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerID="f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6" exitCode=0 Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.711803 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerDied","Data":"f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.711828 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerStarted","Data":"a155f2f40a6d62b3aca390353ce90204bc0593d6690e09d3cda7c183316d6693"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.714382 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c946647fc-lsk5p" event={"ID":"814c0372-f441-4fce-b7d3-47827597fdd5","Type":"ContainerStarted","Data":"fd9c78a98873f5ba5e8aac367c8495600769e4943ae5b32782e5840a16e7bd77"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.737741 4783 generic.go:334] "Generic (PLEG): container finished" podID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerID="013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02" exitCode=0 Jan 31 
09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.737766 4783 generic.go:334] "Generic (PLEG): container finished" podID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerID="9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2" exitCode=2 Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.737775 4783 generic.go:334] "Generic (PLEG): container finished" podID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerID="3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6" exitCode=0 Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.738206 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerDied","Data":"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.738263 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerDied","Data":"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.738275 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerDied","Data":"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6"} Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771066 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771145 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771360 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771477 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771531 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.771558 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk8km\" (UniqueName: \"kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.772364 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.772485 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.772748 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.772947 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.773149 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.784719 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.786545 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.788593 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.792615 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.792852 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk8km\" (UniqueName: \"kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km\") pod \"dnsmasq-dns-75bfc9b94f-ck9lr\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.796997 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.862008 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-676f7f866-qr2ck" podStartSLOduration=3.8619860089999998 podStartE2EDuration="3.861986009s" podCreationTimestamp="2026-01-31 09:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:19.818745605 +0000 UTC m=+870.487429264" watchObservedRunningTime="2026-01-31 09:19:19.861986009 +0000 UTC m=+870.530669476" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.874957 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875251 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsjv\" (UniqueName: \"kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875521 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875607 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875747 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875821 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.875891 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.960443 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978720 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978777 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978849 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978877 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978907 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978938 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.978970 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsjv\" (UniqueName: \"kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.979910 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7994d94564-47gt2"] Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.980099 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.981780 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.985517 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.988649 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.989585 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.989888 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.998139 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 31 09:19:19 crc kubenswrapper[4783]: I0131 09:19:19.998900 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 
09:19:20.001799 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.008812 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7994d94564-47gt2"] Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.042709 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsjv\" (UniqueName: \"kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv\") pod \"cinder-api-0\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " pod="openstack/cinder-api-0" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083123 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083228 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-internal-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083300 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdps\" (UniqueName: \"kubernetes.io/projected/6ec6cc70-cb74-40f6-acb3-3423b5045651-kube-api-access-5bdps\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083334 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-public-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083361 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data-custom\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083393 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-combined-ca-bundle\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.083430 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec6cc70-cb74-40f6-acb3-3423b5045651-logs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.121387 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.184772 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-internal-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.184881 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdps\" (UniqueName: \"kubernetes.io/projected/6ec6cc70-cb74-40f6-acb3-3423b5045651-kube-api-access-5bdps\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.184918 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-public-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.184948 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data-custom\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.184993 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-combined-ca-bundle\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " 
pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.185042 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec6cc70-cb74-40f6-acb3-3423b5045651-logs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.185071 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.189480 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec6cc70-cb74-40f6-acb3-3423b5045651-logs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.198651 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data-custom\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.198816 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-internal-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 
09:19:20.199113 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-config-data\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.205299 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-public-tls-certs\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.216405 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec6cc70-cb74-40f6-acb3-3423b5045651-combined-ca-bundle\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.225731 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdps\" (UniqueName: \"kubernetes.io/projected/6ec6cc70-cb74-40f6-acb3-3423b5045651-kube-api-access-5bdps\") pod \"barbican-api-7994d94564-47gt2\" (UID: \"6ec6cc70-cb74-40f6-acb3-3423b5045651\") " pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: E0131 09:19:20.307114 4783 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 31 09:19:20 crc kubenswrapper[4783]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/7f688143-1580-4aaa-8cbe-9368c7f6c0b2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 31 09:19:20 crc kubenswrapper[4783]: > 
podSandboxID="ed7a2cfc1131d57c5fe0f209f6b974cd48101b5686a6cb98e7949477ea5b771c" Jan 31 09:19:20 crc kubenswrapper[4783]: E0131 09:19:20.307381 4783 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 31 09:19:20 crc kubenswrapper[4783]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h8bhd9h696h649h588h5c6h658h5b4h57fh65h89h5f5h56h696h5dh8h57h597h68ch568h58dh66hf4h675h598h588h67dhb5h69h5dh6bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,Mount
Propagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lj8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bdf86f46f-7h727_openstack(7f688143-1580-4aaa-8cbe-9368c7f6c0b2): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/7f688143-1580-4aaa-8cbe-9368c7f6c0b2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 31 09:19:20 crc kubenswrapper[4783]: > logger="UnhandledError" Jan 31 09:19:20 crc kubenswrapper[4783]: E0131 09:19:20.308508 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: 
mount `/var/lib/kubelet/pods/7f688143-1580-4aaa-8cbe-9368c7f6c0b2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" podUID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.311796 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:19:20 crc kubenswrapper[4783]: W0131 09:19:20.341661 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb557cb95_3ddb_4f51_857f_7e044b7975f3.slice/crio-9607c34d37758ead039cb10489743048b74467f5c77e50cf34302b21b22c1df3 WatchSource:0}: Error finding container 9607c34d37758ead039cb10489743048b74467f5c77e50cf34302b21b22c1df3: Status 404 returned error can't find the container with id 9607c34d37758ead039cb10489743048b74467f5c77e50cf34302b21b22c1df3 Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.510757 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:20 crc kubenswrapper[4783]: E0131 09:19:20.534114 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f688143_1580_4aaa_8cbe_9368c7f6c0b2.slice/crio-cbaed06a87bb85a227b97753ab6774b48901f5fca1441eefd527ed18de88ae51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f688143_1580_4aaa_8cbe_9368c7f6c0b2.slice/crio-conmon-cbaed06a87bb85a227b97753ab6774b48901f5fca1441eefd527ed18de88ae51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7e6f7a_4b59_42fd_9ef2_4f761e2d0af9.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.678866 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.697401 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.720988 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.731296 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vxv6q" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="registry-server" probeResult="failure" output=< Jan 31 09:19:20 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s 
Jan 31 09:19:20 crc kubenswrapper[4783]: > Jan 31 09:19:20 crc kubenswrapper[4783]: W0131 09:19:20.739701 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043ade57_8838_4359_96dc_aa6dd0d8dfa8.slice/crio-0c2979e2b3c6459643a217dad4985b24002a88e01930684cd154210d4dbad41d WatchSource:0}: Error finding container 0c2979e2b3c6459643a217dad4985b24002a88e01930684cd154210d4dbad41d: Status 404 returned error can't find the container with id 0c2979e2b3c6459643a217dad4985b24002a88e01930684cd154210d4dbad41d Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.753395 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerStarted","Data":"9607c34d37758ead039cb10489743048b74467f5c77e50cf34302b21b22c1df3"} Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.756480 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" event={"ID":"d3f5feb6-3f16-411f-8bea-22a5badd3b44","Type":"ContainerStarted","Data":"0c117cbda33455270d4d9caa2b7fe2b38f98e246fe8ae13199fe34d55837efd2"} Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.759218 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerID="c1dc630b83a51d2dcba2bae844cc4212d79e8d89278eba7264f1d2987846d042" exitCode=0 Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.759273 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerDied","Data":"c1dc630b83a51d2dcba2bae844cc4212d79e8d89278eba7264f1d2987846d042"} Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.759297 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" 
event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerStarted","Data":"52457fa5f831f47e074ecd4c7e736c7748e6a2513a1475254742614bed8e509d"} Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.766770 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerStarted","Data":"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5"} Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.966864 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.969294 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d7fc9755-j47fz" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-api" containerID="cri-o://0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975" gracePeriod=30 Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.970046 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d7fc9755-j47fz" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd" containerID="cri-o://63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20" gracePeriod=30 Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.993308 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-754bf5467-627tt"] Jan 31 09:19:20 crc kubenswrapper[4783]: I0131 09:19:20.994725 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.049372 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754bf5467-627tt"] Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.060579 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-69d7fc9755-j47fz" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": EOF" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116493 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116571 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-public-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116612 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jj2\" (UniqueName: \"kubernetes.io/projected/e522ac0d-9e88-42f8-82a7-54cb22a15841-kube-api-access-b7jj2\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116713 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-combined-ca-bundle\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116773 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-httpd-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116805 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-ovndb-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.116915 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-internal-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.137449 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7994d94564-47gt2"] Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.218961 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-public-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 
09:19:21.219020 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jj2\" (UniqueName: \"kubernetes.io/projected/e522ac0d-9e88-42f8-82a7-54cb22a15841-kube-api-access-b7jj2\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.219085 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-combined-ca-bundle\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.219119 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-httpd-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.219141 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-ovndb-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.219215 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-internal-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.219260 4783 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.225280 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.228016 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-httpd-config\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.228719 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-internal-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.234809 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-combined-ca-bundle\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.234912 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-ovndb-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: 
\"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.236979 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jj2\" (UniqueName: \"kubernetes.io/projected/e522ac0d-9e88-42f8-82a7-54cb22a15841-kube-api-access-b7jj2\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.238358 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e522ac0d-9e88-42f8-82a7-54cb22a15841-public-tls-certs\") pod \"neutron-754bf5467-627tt\" (UID: \"e522ac0d-9e88-42f8-82a7-54cb22a15841\") " pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.338110 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.774243 4783 generic.go:334] "Generic (PLEG): container finished" podID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerID="4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5" exitCode=0 Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.774347 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerDied","Data":"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5"} Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.777885 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerStarted","Data":"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7"} Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.777928 4783 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerStarted","Data":"0c2979e2b3c6459643a217dad4985b24002a88e01930684cd154210d4dbad41d"} Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.784107 4783 generic.go:334] "Generic (PLEG): container finished" podID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerID="63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20" exitCode=0 Jan 31 09:19:21 crc kubenswrapper[4783]: I0131 09:19:21.784729 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerDied","Data":"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.123851 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.242891 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.243246 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lj8n\" (UniqueName: \"kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.243346 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: 
\"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.243422 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.243452 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.243495 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb\") pod \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\" (UID: \"7f688143-1580-4aaa-8cbe-9368c7f6c0b2\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.260381 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n" (OuterVolumeSpecName: "kube-api-access-8lj8n") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "kube-api-access-8lj8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.298658 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config" (OuterVolumeSpecName: "config") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.310696 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.315698 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.317730 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.325844 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f688143-1580-4aaa-8cbe-9368c7f6c0b2" (UID: "7f688143-1580-4aaa-8cbe-9368c7f6c0b2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346235 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346269 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lj8n\" (UniqueName: \"kubernetes.io/projected/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-kube-api-access-8lj8n\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346285 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346297 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346306 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.346316 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f688143-1580-4aaa-8cbe-9368c7f6c0b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.365915 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447512 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8h26\" (UniqueName: \"kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447628 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447683 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447716 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447756 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447822 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.447847 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data\") pod \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\" (UID: \"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e\") " Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.448618 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.449235 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.455137 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts" (OuterVolumeSpecName: "scripts") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.455443 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26" (OuterVolumeSpecName: "kube-api-access-f8h26") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "kube-api-access-f8h26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.550544 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8h26\" (UniqueName: \"kubernetes.io/projected/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-kube-api-access-f8h26\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.550573 4783 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.550583 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.550591 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.602942 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754bf5467-627tt"] Jan 31 09:19:22 crc kubenswrapper[4783]: W0131 09:19:22.646518 4783 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode522ac0d_9e88_42f8_82a7_54cb22a15841.slice/crio-f9dd7d296639fd6c71527420b86bc1843b2fdb6e8fcdc853b47f20820b0fe9ba WatchSource:0}: Error finding container f9dd7d296639fd6c71527420b86bc1843b2fdb6e8fcdc853b47f20820b0fe9ba: Status 404 returned error can't find the container with id f9dd7d296639fd6c71527420b86bc1843b2fdb6e8fcdc853b47f20820b0fe9ba Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.663317 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-69d7fc9755-j47fz" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": dial tcp 10.217.0.150:9696: connect: connection refused" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.690069 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.728871 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.754526 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.754551 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.756311 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data" (OuterVolumeSpecName: "config-data") pod "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" (UID: "8017cd3e-e5c3-4b9c-a3ac-8a155e67119e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.812211 4783 generic.go:334] "Generic (PLEG): container finished" podID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerID="8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8" exitCode=0 Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.812277 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" event={"ID":"d3f5feb6-3f16-411f-8bea-22a5badd3b44","Type":"ContainerDied","Data":"8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.822173 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c946647fc-lsk5p" event={"ID":"814c0372-f441-4fce-b7d3-47827597fdd5","Type":"ContainerStarted","Data":"bacfc0d4efb2ddb8a4ed092e0ae1a6faa23978f251df15d517decddafc584473"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.824481 4783 generic.go:334] 
"Generic (PLEG): container finished" podID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerID="adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a" exitCode=0 Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.824526 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerDied","Data":"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.824566 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8017cd3e-e5c3-4b9c-a3ac-8a155e67119e","Type":"ContainerDied","Data":"afb5f3344ad57551329d1cccec8e3d382bd8d10a9cbc5cf621df6b0751f0874e"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.824600 4783 scope.go:117] "RemoveContainer" containerID="013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.824699 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.830068 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7994d94564-47gt2" event={"ID":"6ec6cc70-cb74-40f6-acb3-3423b5045651","Type":"ContainerStarted","Data":"9c1f42b15fa33616ea0abcdf47a3ec9a41ceb6f44bb182e5f6a1a78ddd98cc5c"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.830110 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7994d94564-47gt2" event={"ID":"6ec6cc70-cb74-40f6-acb3-3423b5045651","Type":"ContainerStarted","Data":"b2222fd663bdf8d14d00c8a4df85b66667adf806f9728b95f3a399d5d77cd08b"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.836792 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754bf5467-627tt" event={"ID":"e522ac0d-9e88-42f8-82a7-54cb22a15841","Type":"ContainerStarted","Data":"f9dd7d296639fd6c71527420b86bc1843b2fdb6e8fcdc853b47f20820b0fe9ba"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.851766 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" event={"ID":"3cd5ec39-81e0-44cd-b99f-01e3d301b192","Type":"ContainerStarted","Data":"d89ef87d521586f373da50324c9311e14c4c7becb4b6c1e7314b2ff67dc62528"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.856981 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.861040 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerStarted","Data":"ae7fce32118e69b843e67e5267f1b57a0b020c45a6d104f24057490ed9b3003d"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.863074 4783 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" event={"ID":"7f688143-1580-4aaa-8cbe-9368c7f6c0b2","Type":"ContainerDied","Data":"ed7a2cfc1131d57c5fe0f209f6b974cd48101b5686a6cb98e7949477ea5b771c"} Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.863172 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-7h727" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.892260 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.911568 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936286 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:22 crc kubenswrapper[4783]: E0131 09:19:22.936703 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="sg-core" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936718 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="sg-core" Jan 31 09:19:22 crc kubenswrapper[4783]: E0131 09:19:22.936743 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" containerName="init" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936748 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" containerName="init" Jan 31 09:19:22 crc kubenswrapper[4783]: E0131 09:19:22.936756 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="proxy-httpd" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936763 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" 
containerName="proxy-httpd" Jan 31 09:19:22 crc kubenswrapper[4783]: E0131 09:19:22.936785 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-central-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936791 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-central-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: E0131 09:19:22.936806 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-notification-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936812 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-notification-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.936995 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="sg-core" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.937009 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-notification-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.937015 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" containerName="init" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.937030 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="ceilometer-central-agent" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.937043 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" containerName="proxy-httpd" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.939346 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.947738 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.947923 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.975236 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:22 crc kubenswrapper[4783]: I0131 09:19:22.996897 4783 scope.go:117] "RemoveContainer" containerID="9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.036478 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.046248 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-7h727"] Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.055214 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069045 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069086 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzsqf\" (UniqueName: \"kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: 
I0131 09:19:23.069108 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069150 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069409 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069838 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.069897 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.172726 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.173003 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzsqf\" (UniqueName: \"kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.173025 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.173097 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.173269 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.173315 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: 
I0131 09:19:23.173339 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.174309 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.176667 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.185433 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.185803 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.185905 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " 
pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.195959 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.203823 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzsqf\" (UniqueName: \"kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf\") pod \"ceilometer-0\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.301192 4783 scope.go:117] "RemoveContainer" containerID="adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.324938 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.343094 4783 scope.go:117] "RemoveContainer" containerID="3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.366503 4783 scope.go:117] "RemoveContainer" containerID="013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02" Jan 31 09:19:23 crc kubenswrapper[4783]: E0131 09:19:23.368480 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02\": container with ID starting with 013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02 not found: ID does not exist" containerID="013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.368528 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02"} err="failed to get container status \"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02\": rpc error: code = NotFound desc = could not find container \"013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02\": container with ID starting with 013c28206759b2989b31fbc7152bd1a5e01f77975e280fc08d079e88b518cc02 not found: ID does not exist" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.368557 4783 scope.go:117] "RemoveContainer" containerID="9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2" Jan 31 09:19:23 crc kubenswrapper[4783]: E0131 09:19:23.368990 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2\": container with ID starting with 
9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2 not found: ID does not exist" containerID="9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.369056 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2"} err="failed to get container status \"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2\": rpc error: code = NotFound desc = could not find container \"9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2\": container with ID starting with 9bf9dec205d42c2fdaeed84eec56a1c8d8c327c7e74b9351e877e25aa1eea0b2 not found: ID does not exist" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.369099 4783 scope.go:117] "RemoveContainer" containerID="adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a" Jan 31 09:19:23 crc kubenswrapper[4783]: E0131 09:19:23.373251 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a\": container with ID starting with adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a not found: ID does not exist" containerID="adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.373282 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a"} err="failed to get container status \"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a\": rpc error: code = NotFound desc = could not find container \"adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a\": container with ID starting with adcab2008d1698aeceedc72c8095b7356b8d7c69ce11c1f9ac18f86d09ddbe7a not found: ID does not 
exist" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.373299 4783 scope.go:117] "RemoveContainer" containerID="3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6" Jan 31 09:19:23 crc kubenswrapper[4783]: E0131 09:19:23.376385 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6\": container with ID starting with 3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6 not found: ID does not exist" containerID="3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.376412 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6"} err="failed to get container status \"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6\": rpc error: code = NotFound desc = could not find container \"3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6\": container with ID starting with 3adbf3400e228d9066ac644318558c4c47c5c19173652a94f1aa77cf80a3adc6 not found: ID does not exist" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.376427 4783 scope.go:117] "RemoveContainer" containerID="d0ea84461d9f22e06c40321cd19c7080bbb0e3da1db5e1fbe90c4a2b80f3b2f5" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.690391 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f688143-1580-4aaa-8cbe-9368c7f6c0b2" path="/var/lib/kubelet/pods/7f688143-1580-4aaa-8cbe-9368c7f6c0b2/volumes" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.691315 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8017cd3e-e5c3-4b9c-a3ac-8a155e67119e" path="/var/lib/kubelet/pods/8017cd3e-e5c3-4b9c-a3ac-8a155e67119e/volumes" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.905094 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" event={"ID":"3cd5ec39-81e0-44cd-b99f-01e3d301b192","Type":"ContainerStarted","Data":"2c55000e3452c1c5c3aaf7edbcd118c7aa994b657b462a739e4ef31f48c9c6b1"} Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.905820 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.916564 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" event={"ID":"d3f5feb6-3f16-411f-8bea-22a5badd3b44","Type":"ContainerStarted","Data":"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964"} Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.918457 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.929599 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5cf88c75b6-glzzx" podStartSLOduration=4.625691665 podStartE2EDuration="7.929588146s" podCreationTimestamp="2026-01-31 09:19:16 +0000 UTC" firstStartedPulling="2026-01-31 09:19:18.917473029 +0000 UTC m=+869.586156497" lastFinishedPulling="2026-01-31 09:19:22.22136951 +0000 UTC m=+872.890052978" observedRunningTime="2026-01-31 09:19:23.921354074 +0000 UTC m=+874.590037542" watchObservedRunningTime="2026-01-31 09:19:23.929588146 +0000 UTC m=+874.598271605" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.950021 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" podStartSLOduration=4.950002701 podStartE2EDuration="4.950002701s" podCreationTimestamp="2026-01-31 09:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:23.947474996 +0000 
UTC m=+874.616158464" watchObservedRunningTime="2026-01-31 09:19:23.950002701 +0000 UTC m=+874.618686169" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.974095 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c946647fc-lsk5p" event={"ID":"814c0372-f441-4fce-b7d3-47827597fdd5","Type":"ContainerStarted","Data":"b9c1db650c6a694ded5c083e336a4ceb741f82dd9841f647789971b898719535"} Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.981020 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api-log" containerID="cri-o://cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" gracePeriod=30 Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.981453 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api" containerID="cri-o://3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" gracePeriod=30 Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.981720 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerStarted","Data":"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1"} Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.981775 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.994926 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c946647fc-lsk5p" podStartSLOduration=4.722743189 podStartE2EDuration="7.994913422s" podCreationTimestamp="2026-01-31 09:19:16 +0000 UTC" firstStartedPulling="2026-01-31 09:19:18.739924458 +0000 UTC m=+869.408607926" lastFinishedPulling="2026-01-31 
09:19:22.012094691 +0000 UTC m=+872.680778159" observedRunningTime="2026-01-31 09:19:23.99323044 +0000 UTC m=+874.661913908" watchObservedRunningTime="2026-01-31 09:19:23.994913422 +0000 UTC m=+874.663596890" Jan 31 09:19:23 crc kubenswrapper[4783]: I0131 09:19:23.997132 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerStarted","Data":"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"} Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.005339 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754bf5467-627tt" event={"ID":"e522ac0d-9e88-42f8-82a7-54cb22a15841","Type":"ContainerStarted","Data":"1c9ac3a6f0d26c08a88768e5cf243db0553cc70392e586c37e33daaed0a998b1"} Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.005383 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754bf5467-627tt" event={"ID":"e522ac0d-9e88-42f8-82a7-54cb22a15841","Type":"ContainerStarted","Data":"61bb4fff952c65a4e1bd2654d245f98d44da2a57d497b95485803f2224517813"} Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.006301 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.012513 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.012501659 podStartE2EDuration="5.012501659s" podCreationTimestamp="2026-01-31 09:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:24.009973322 +0000 UTC m=+874.678656790" watchObservedRunningTime="2026-01-31 09:19:24.012501659 +0000 UTC m=+874.681185127" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.032297 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7994d94564-47gt2" event={"ID":"6ec6cc70-cb74-40f6-acb3-3423b5045651","Type":"ContainerStarted","Data":"8fe7d7b46ca93743d5bfe414258f88fa2c2d22c542e1c905792f143b6f4ad2f4"} Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.032985 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.033007 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.034259 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-754bf5467-627tt" podStartSLOduration=4.034250808 podStartE2EDuration="4.034250808s" podCreationTimestamp="2026-01-31 09:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:24.033288875 +0000 UTC m=+874.701972333" watchObservedRunningTime="2026-01-31 09:19:24.034250808 +0000 UTC m=+874.702934267" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.064847 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7994d94564-47gt2" podStartSLOduration=5.064829933 podStartE2EDuration="5.064829933s" podCreationTimestamp="2026-01-31 09:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:24.046095076 +0000 UTC m=+874.714778544" watchObservedRunningTime="2026-01-31 09:19:24.064829933 +0000 UTC m=+874.733513401" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.065295 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" 
event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerStarted","Data":"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa"} Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.097835 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7k8d2" podStartSLOduration=6.364682301 podStartE2EDuration="9.097818751s" podCreationTimestamp="2026-01-31 09:19:15 +0000 UTC" firstStartedPulling="2026-01-31 09:19:19.714193121 +0000 UTC m=+870.382876579" lastFinishedPulling="2026-01-31 09:19:22.447329561 +0000 UTC m=+873.116013029" observedRunningTime="2026-01-31 09:19:24.084850194 +0000 UTC m=+874.753533662" watchObservedRunningTime="2026-01-31 09:19:24.097818751 +0000 UTC m=+874.766502220" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.693710 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.824834 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.824884 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.824958 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 
09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825041 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825031 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825086 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsjv\" (UniqueName: \"kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825181 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825209 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom\") pod \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\" (UID: \"043ade57-8838-4359-96dc-aa6dd0d8dfa8\") " Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825307 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs" (OuterVolumeSpecName: "logs") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.825542 4783 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/043ade57-8838-4359-96dc-aa6dd0d8dfa8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.832555 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv" (OuterVolumeSpecName: "kube-api-access-6gsjv") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "kube-api-access-6gsjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.833282 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.834241 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts" (OuterVolumeSpecName: "scripts") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.862345 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.877259 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data" (OuterVolumeSpecName: "config-data") pod "043ade57-8838-4359-96dc-aa6dd0d8dfa8" (UID: "043ade57-8838-4359-96dc-aa6dd0d8dfa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.928339 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsjv\" (UniqueName: \"kubernetes.io/projected/043ade57-8838-4359-96dc-aa6dd0d8dfa8-kube-api-access-6gsjv\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.928374 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.928385 4783 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.928412 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/043ade57-8838-4359-96dc-aa6dd0d8dfa8-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 
crc kubenswrapper[4783]: I0131 09:19:24.928420 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4783]: I0131 09:19:24.928429 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ade57-8838-4359-96dc-aa6dd0d8dfa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.075478 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerStarted","Data":"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.075525 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerStarted","Data":"332623a71829e75f4be7d85100bd1a5193305cdc96ecc7aa6ca2a51251e5bf36"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.077924 4783 generic.go:334] "Generic (PLEG): container finished" podID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerID="3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.077959 4783 generic.go:334] "Generic (PLEG): container finished" podID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerID="cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" exitCode=143 Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.078007 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.078021 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerDied","Data":"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.078052 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerDied","Data":"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.078064 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"043ade57-8838-4359-96dc-aa6dd0d8dfa8","Type":"ContainerDied","Data":"0c2979e2b3c6459643a217dad4985b24002a88e01930684cd154210d4dbad41d"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.078085 4783 scope.go:117] "RemoveContainer" containerID="3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.080066 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerStarted","Data":"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a"} Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.101548 4783 scope.go:117] "RemoveContainer" containerID="cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.111511 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.164205638 podStartE2EDuration="6.111487829s" podCreationTimestamp="2026-01-31 09:19:19 +0000 UTC" firstStartedPulling="2026-01-31 09:19:20.348015563 +0000 UTC m=+871.016699031" 
lastFinishedPulling="2026-01-31 09:19:22.295297753 +0000 UTC m=+872.963981222" observedRunningTime="2026-01-31 09:19:25.104571802 +0000 UTC m=+875.773255270" watchObservedRunningTime="2026-01-31 09:19:25.111487829 +0000 UTC m=+875.780171296" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.124106 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.149520 4783 scope.go:117] "RemoveContainer" containerID="3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.150070 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:25 crc kubenswrapper[4783]: E0131 09:19:25.152061 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1\": container with ID starting with 3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1 not found: ID does not exist" containerID="3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.152102 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1"} err="failed to get container status \"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1\": rpc error: code = NotFound desc = could not find container \"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1\": container with ID starting with 3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1 not found: ID does not exist" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.152126 4783 scope.go:117] "RemoveContainer" containerID="cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" Jan 31 09:19:25 crc kubenswrapper[4783]: 
E0131 09:19:25.154096 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7\": container with ID starting with cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7 not found: ID does not exist" containerID="cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.154137 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7"} err="failed to get container status \"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7\": rpc error: code = NotFound desc = could not find container \"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7\": container with ID starting with cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7 not found: ID does not exist" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.154186 4783 scope.go:117] "RemoveContainer" containerID="3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.154462 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1"} err="failed to get container status \"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1\": rpc error: code = NotFound desc = could not find container \"3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1\": container with ID starting with 3d891f33983d6d57be1fba28d8b9ba6dabe9864955f45261c26dce848c93f1a1 not found: ID does not exist" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.154494 4783 scope.go:117] "RemoveContainer" containerID="cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7" Jan 31 09:19:25 crc 
kubenswrapper[4783]: I0131 09:19:25.162819 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.166070 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7"} err="failed to get container status \"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7\": rpc error: code = NotFound desc = could not find container \"cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7\": container with ID starting with cdb853a07c5e0228936718f83c2ff3eb5667a0551f5e1069a0508ef93a11c9d7 not found: ID does not exist" Jan 31 09:19:25 crc kubenswrapper[4783]: E0131 09:19:25.172013 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api-log" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.172058 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api-log" Jan 31 09:19:25 crc kubenswrapper[4783]: E0131 09:19:25.172079 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.172086 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.173232 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.173259 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" containerName="cinder-api-log" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.175053 4783 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.198947 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.199730 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.200936 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.220189 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.235748 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data-custom\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.237495 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-scripts\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.237601 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.237628 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.237717 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.238240 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efc715c4-8350-4307-91b8-d33c62513e41-etc-machine-id\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.238420 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpb6\" (UniqueName: \"kubernetes.io/projected/efc715c4-8350-4307-91b8-d33c62513e41-kube-api-access-krpb6\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.238669 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efc715c4-8350-4307-91b8-d33c62513e41-logs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.238800 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-public-tls-certs\") 
pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.312534 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341398 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-scripts\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341526 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341605 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341677 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341767 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efc715c4-8350-4307-91b8-d33c62513e41-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341831 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpb6\" (UniqueName: \"kubernetes.io/projected/efc715c4-8350-4307-91b8-d33c62513e41-kube-api-access-krpb6\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341900 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efc715c4-8350-4307-91b8-d33c62513e41-logs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.341964 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-public-tls-certs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.342037 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data-custom\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.343054 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efc715c4-8350-4307-91b8-d33c62513e41-etc-machine-id\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.343435 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/efc715c4-8350-4307-91b8-d33c62513e41-logs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.348729 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-scripts\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.349801 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.351122 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-public-tls-certs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.351624 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data-custom\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.353000 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-config-data\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.353012 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efc715c4-8350-4307-91b8-d33c62513e41-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.360435 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpb6\" (UniqueName: \"kubernetes.io/projected/efc715c4-8350-4307-91b8-d33c62513e41-kube-api-access-krpb6\") pod \"cinder-api-0\" (UID: \"efc715c4-8350-4307-91b8-d33c62513e41\") " pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.493664 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.523734 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.661044 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043ade57-8838-4359-96dc-aa6dd0d8dfa8" path="/var/lib/kubelet/pods/043ade57-8838-4359-96dc-aa6dd0d8dfa8/volumes" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.955245 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:19:25 crc kubenswrapper[4783]: W0131 09:19:25.968121 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc715c4_8350_4307_91b8_d33c62513e41.slice/crio-4fb3930c109c7d75e7a5c213f32c5975ac6ebe970dbac2031a12eda84eecb52d WatchSource:0}: Error finding container 4fb3930c109c7d75e7a5c213f32c5975ac6ebe970dbac2031a12eda84eecb52d: Status 404 returned error can't find the container with id 4fb3930c109c7d75e7a5c213f32c5975ac6ebe970dbac2031a12eda84eecb52d Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.996955 
4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:25 crc kubenswrapper[4783]: I0131 09:19:25.997111 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.045512 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.094538 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efc715c4-8350-4307-91b8-d33c62513e41","Type":"ContainerStarted","Data":"4fb3930c109c7d75e7a5c213f32c5975ac6ebe970dbac2031a12eda84eecb52d"} Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.097494 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerID="ae7fce32118e69b843e67e5267f1b57a0b020c45a6d104f24057490ed9b3003d" exitCode=0 Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.097556 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerDied","Data":"ae7fce32118e69b843e67e5267f1b57a0b020c45a6d104f24057490ed9b3003d"} Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.124396 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerStarted","Data":"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af"} Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.512704 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.570881 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571149 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571288 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571340 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571380 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571461 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl9lm\" (UniqueName: 
\"kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.571571 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs\") pod \"4276e01a-227a-4370-8b9b-cfc5123aa13d\" (UID: \"4276e01a-227a-4370-8b9b-cfc5123aa13d\") " Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.578128 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.584656 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm" (OuterVolumeSpecName: "kube-api-access-cl9lm") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "kube-api-access-cl9lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.637728 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.645680 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.676344 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config" (OuterVolumeSpecName: "config") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.695083 4783 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.695156 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.695251 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.695263 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl9lm\" (UniqueName: \"kubernetes.io/projected/4276e01a-227a-4370-8b9b-cfc5123aa13d-kube-api-access-cl9lm\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc 
kubenswrapper[4783]: I0131 09:19:26.695274 4783 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.699635 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.705909 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4276e01a-227a-4370-8b9b-cfc5123aa13d" (UID: "4276e01a-227a-4370-8b9b-cfc5123aa13d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.801770 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4783]: I0131 09:19:26.801822 4783 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276e01a-227a-4370-8b9b-cfc5123aa13d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.063591 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.075394 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6644bf8978-q24zg" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.147947 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.153513 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerStarted","Data":"5ec55dd2413d0eacf1ee06eda3ffc9c4be3e3c6d5215ce3d51a046c8144443e4"} Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.179242 4783 generic.go:334] "Generic (PLEG): container finished" podID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerID="0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975" exitCode=0 Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.179346 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d7fc9755-j47fz" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.182265 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerDied","Data":"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975"} Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.182419 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d7fc9755-j47fz" event={"ID":"4276e01a-227a-4370-8b9b-cfc5123aa13d","Type":"ContainerDied","Data":"1bc37110440c86d5cc116d30e5c29f9eb0e976e736f5032c587e80e0cac0b85f"} Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.182548 4783 scope.go:117] "RemoveContainer" containerID="63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.199921 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dmfcr" podStartSLOduration=3.373565198 podStartE2EDuration="9.199909726s" podCreationTimestamp="2026-01-31 09:19:18 +0000 UTC" firstStartedPulling="2026-01-31 09:19:20.765016238 +0000 UTC m=+871.433699707" lastFinishedPulling="2026-01-31 09:19:26.591360768 +0000 UTC m=+877.260044235" observedRunningTime="2026-01-31 09:19:27.199522325 +0000 UTC m=+877.868205793" watchObservedRunningTime="2026-01-31 09:19:27.199909726 +0000 UTC m=+877.868593194" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.227271 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerStarted","Data":"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd"} Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.257245 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"efc715c4-8350-4307-91b8-d33c62513e41","Type":"ContainerStarted","Data":"04c189bddda97519560cdef22d1ca9750dd19b6ec1da09e419eacd051f35e3a4"} Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.257313 4783 scope.go:117] "RemoveContainer" containerID="0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.257913 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon-log" containerID="cri-o://01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2" gracePeriod=30 Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.258219 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" containerID="cri-o://fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458" gracePeriod=30 Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.314529 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.315139 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69d7fc9755-j47fz"] Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.355591 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.372203 4783 scope.go:117] "RemoveContainer" containerID="63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20" Jan 31 09:19:27 crc kubenswrapper[4783]: E0131 09:19:27.372718 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20\": container with 
ID starting with 63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20 not found: ID does not exist" containerID="63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.372777 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20"} err="failed to get container status \"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20\": rpc error: code = NotFound desc = could not find container \"63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20\": container with ID starting with 63fb2d8722bb74c63b5eb99153805cee774c4fe20b37ba7479fec958c8e8da20 not found: ID does not exist" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.372807 4783 scope.go:117] "RemoveContainer" containerID="0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975" Jan 31 09:19:27 crc kubenswrapper[4783]: E0131 09:19:27.381761 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975\": container with ID starting with 0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975 not found: ID does not exist" containerID="0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.381789 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975"} err="failed to get container status \"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975\": rpc error: code = NotFound desc = could not find container \"0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975\": container with ID starting with 0aaee8a39b35d3d99a54863e0fd33968badf4049ade7b35aaf9d948501d50975 not 
found: ID does not exist" Jan 31 09:19:27 crc kubenswrapper[4783]: I0131 09:19:27.667214 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" path="/var/lib/kubelet/pods/4276e01a-227a-4370-8b9b-cfc5123aa13d/volumes" Jan 31 09:19:28 crc kubenswrapper[4783]: I0131 09:19:28.265767 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efc715c4-8350-4307-91b8-d33c62513e41","Type":"ContainerStarted","Data":"da7519874c2086fe280a938b51ccb653f4b0833d393071c2df886d413f6bd96a"} Jan 31 09:19:28 crc kubenswrapper[4783]: I0131 09:19:28.265916 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 31 09:19:28 crc kubenswrapper[4783]: I0131 09:19:28.292122 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.292104193 podStartE2EDuration="3.292104193s" podCreationTimestamp="2026-01-31 09:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:28.286544883 +0000 UTC m=+878.955228351" watchObservedRunningTime="2026-01-31 09:19:28.292104193 +0000 UTC m=+878.960787660" Jan 31 09:19:28 crc kubenswrapper[4783]: I0131 09:19:28.718008 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:28 crc kubenswrapper[4783]: I0131 09:19:28.733319 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-676f7f866-qr2ck" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.135656 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.135713 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.255876 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.281747 4783 generic.go:334] "Generic (PLEG): container finished" podID="89312075-7597-4743-b92c-58411b26f1ec" containerID="5837f58fda0b78176bc8be8853f671a14f27ac76e9d7dad37189349edab34ccd" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.281779 4783 generic.go:334] "Generic (PLEG): container finished" podID="89312075-7597-4743-b92c-58411b26f1ec" containerID="f463ce51fb0c33bcc2b49a6a026db8c692b2c493624d349193048fd4dc640e47" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.281823 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerDied","Data":"5837f58fda0b78176bc8be8853f671a14f27ac76e9d7dad37189349edab34ccd"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.281849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerDied","Data":"f463ce51fb0c33bcc2b49a6a026db8c692b2c493624d349193048fd4dc640e47"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.286613 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerStarted","Data":"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.287533 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.291297 4783 generic.go:334] "Generic (PLEG): container finished" podID="349997ee-6053-4d85-8eae-1d4adf3b347e" 
containerID="408c4aa1ce013f629cebff83c2e2cac264dcd1ec753f8bffec9f7d71def3f984" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.291319 4783 generic.go:334] "Generic (PLEG): container finished" podID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerID="c0759064018d6758926b71f9e8597f9a998bd45b9e23a808b5eaa8ff4d6aeef6" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.291348 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerDied","Data":"408c4aa1ce013f629cebff83c2e2cac264dcd1ec753f8bffec9f7d71def3f984"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.291366 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerDied","Data":"c0759064018d6758926b71f9e8597f9a998bd45b9e23a808b5eaa8ff4d6aeef6"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.300380 4783 generic.go:334] "Generic (PLEG): container finished" podID="e73bb130-1464-433b-b34d-4af489f73b46" containerID="fb71f926c1a6ece1f05a32110e077eaf0f05f8ec37d922bbb2a1088d409bd733" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.300420 4783 generic.go:334] "Generic (PLEG): container finished" podID="e73bb130-1464-433b-b34d-4af489f73b46" containerID="9bdb2e062a7ad48e696ef622530639ab29e48c755d8a1d9c67c9433bfcc380cd" exitCode=137 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.300574 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerDied","Data":"fb71f926c1a6ece1f05a32110e077eaf0f05f8ec37d922bbb2a1088d409bd733"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.300609 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7k8d2" 
podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="registry-server" containerID="cri-o://92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa" gracePeriod=2 Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.300641 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerDied","Data":"9bdb2e062a7ad48e696ef622530639ab29e48c755d8a1d9c67c9433bfcc380cd"} Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.313241 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.916962566 podStartE2EDuration="7.313229055s" podCreationTimestamp="2026-01-31 09:19:22 +0000 UTC" firstStartedPulling="2026-01-31 09:19:23.94479337 +0000 UTC m=+874.613476839" lastFinishedPulling="2026-01-31 09:19:28.341059859 +0000 UTC m=+879.009743328" observedRunningTime="2026-01-31 09:19:29.310666886 +0000 UTC m=+879.979350354" watchObservedRunningTime="2026-01-31 09:19:29.313229055 +0000 UTC m=+879.981912523" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.675658 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.753816 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.788931 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.793286 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.801311 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.815578 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.895946 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmzhz\" (UniqueName: \"kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz\") pod \"e73bb130-1464-433b-b34d-4af489f73b46\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896150 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts\") pod \"89312075-7597-4743-b92c-58411b26f1ec\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896230 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key\") pod \"e73bb130-1464-433b-b34d-4af489f73b46\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896276 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs\") pod \"e73bb130-1464-433b-b34d-4af489f73b46\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896402 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data\") pod \"89312075-7597-4743-b92c-58411b26f1ec\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " 
Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896427 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts\") pod \"e73bb130-1464-433b-b34d-4af489f73b46\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896456 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxpfb\" (UniqueName: \"kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb\") pod \"89312075-7597-4743-b92c-58411b26f1ec\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896482 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs\") pod \"89312075-7597-4743-b92c-58411b26f1ec\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896574 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key\") pod \"89312075-7597-4743-b92c-58411b26f1ec\" (UID: \"89312075-7597-4743-b92c-58411b26f1ec\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.896606 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data\") pod \"e73bb130-1464-433b-b34d-4af489f73b46\" (UID: \"e73bb130-1464-433b-b34d-4af489f73b46\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.897715 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs" (OuterVolumeSpecName: "logs") pod 
"89312075-7597-4743-b92c-58411b26f1ec" (UID: "89312075-7597-4743-b92c-58411b26f1ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.903083 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs" (OuterVolumeSpecName: "logs") pod "e73bb130-1464-433b-b34d-4af489f73b46" (UID: "e73bb130-1464-433b-b34d-4af489f73b46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.906647 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.907409 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz" (OuterVolumeSpecName: "kube-api-access-nmzhz") pod "e73bb130-1464-433b-b34d-4af489f73b46" (UID: "e73bb130-1464-433b-b34d-4af489f73b46"). InnerVolumeSpecName "kube-api-access-nmzhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.910195 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e73bb130-1464-433b-b34d-4af489f73b46" (UID: "e73bb130-1464-433b-b34d-4af489f73b46"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.923253 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb" (OuterVolumeSpecName: "kube-api-access-mxpfb") pod "89312075-7597-4743-b92c-58411b26f1ec" (UID: "89312075-7597-4743-b92c-58411b26f1ec"). InnerVolumeSpecName "kube-api-access-mxpfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.923318 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "89312075-7597-4743-b92c-58411b26f1ec" (UID: "89312075-7597-4743-b92c-58411b26f1ec"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.927930 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data" (OuterVolumeSpecName: "config-data") pod "89312075-7597-4743-b92c-58411b26f1ec" (UID: "89312075-7597-4743-b92c-58411b26f1ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.932844 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data" (OuterVolumeSpecName: "config-data") pod "e73bb130-1464-433b-b34d-4af489f73b46" (UID: "e73bb130-1464-433b-b34d-4af489f73b46"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.938558 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts" (OuterVolumeSpecName: "scripts") pod "e73bb130-1464-433b-b34d-4af489f73b46" (UID: "e73bb130-1464-433b-b34d-4af489f73b46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.951290 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts" (OuterVolumeSpecName: "scripts") pod "89312075-7597-4743-b92c-58411b26f1ec" (UID: "89312075-7597-4743-b92c-58411b26f1ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.965114 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.998791 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data\") pod \"349997ee-6053-4d85-8eae-1d4adf3b347e\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.998839 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key\") pod \"349997ee-6053-4d85-8eae-1d4adf3b347e\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.998885 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2t96\" (UniqueName: 
\"kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96\") pod \"349997ee-6053-4d85-8eae-1d4adf3b347e\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.998916 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts\") pod \"349997ee-6053-4d85-8eae-1d4adf3b347e\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999099 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs\") pod \"349997ee-6053-4d85-8eae-1d4adf3b347e\" (UID: \"349997ee-6053-4d85-8eae-1d4adf3b347e\") " Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999680 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999697 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999707 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxpfb\" (UniqueName: \"kubernetes.io/projected/89312075-7597-4743-b92c-58411b26f1ec-kube-api-access-mxpfb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999718 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89312075-7597-4743-b92c-58411b26f1ec-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999727 4783 reconciler_common.go:293] 
"Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89312075-7597-4743-b92c-58411b26f1ec-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999737 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73bb130-1464-433b-b34d-4af489f73b46-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999746 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmzhz\" (UniqueName: \"kubernetes.io/projected/e73bb130-1464-433b-b34d-4af489f73b46-kube-api-access-nmzhz\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999754 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89312075-7597-4743-b92c-58411b26f1ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999763 4783 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e73bb130-1464-433b-b34d-4af489f73b46-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:29 crc kubenswrapper[4783]: I0131 09:19:29.999771 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e73bb130-1464-433b-b34d-4af489f73b46-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.000081 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs" (OuterVolumeSpecName: "logs") pod "349997ee-6053-4d85-8eae-1d4adf3b347e" (UID: "349997ee-6053-4d85-8eae-1d4adf3b347e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.005212 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96" (OuterVolumeSpecName: "kube-api-access-d2t96") pod "349997ee-6053-4d85-8eae-1d4adf3b347e" (UID: "349997ee-6053-4d85-8eae-1d4adf3b347e"). InnerVolumeSpecName "kube-api-access-d2t96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.006793 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "349997ee-6053-4d85-8eae-1d4adf3b347e" (UID: "349997ee-6053-4d85-8eae-1d4adf3b347e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.034732 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts" (OuterVolumeSpecName: "scripts") pod "349997ee-6053-4d85-8eae-1d4adf3b347e" (UID: "349997ee-6053-4d85-8eae-1d4adf3b347e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.035003 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.035711 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="dnsmasq-dns" containerID="cri-o://184b1f0e049c40968bb5d4e9e2e7f3bf901d04dfae41d532236bdd87f80082af" gracePeriod=10 Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.048128 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data" (OuterVolumeSpecName: "config-data") pod "349997ee-6053-4d85-8eae-1d4adf3b347e" (UID: "349997ee-6053-4d85-8eae-1d4adf3b347e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.100968 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content\") pod \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101010 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdk6z\" (UniqueName: \"kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z\") pod \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101186 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities\") pod 
\"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\" (UID: \"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101821 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/349997ee-6053-4d85-8eae-1d4adf3b347e-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101834 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101843 4783 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/349997ee-6053-4d85-8eae-1d4adf3b347e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101851 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2t96\" (UniqueName: \"kubernetes.io/projected/349997ee-6053-4d85-8eae-1d4adf3b347e-kube-api-access-d2t96\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.101860 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/349997ee-6053-4d85-8eae-1d4adf3b347e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.104631 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities" (OuterVolumeSpecName: "utilities") pod "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" (UID: "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.108309 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z" (OuterVolumeSpecName: "kube-api-access-sdk6z") pod "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" (UID: "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c"). InnerVolumeSpecName "kube-api-access-sdk6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.128486 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.166964 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" (UID: "c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.183505 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dmfcr" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="registry-server" probeResult="failure" output=< Jan 31 09:19:30 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:19:30 crc kubenswrapper[4783]: > Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.203802 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.203835 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdk6z\" (UniqueName: \"kubernetes.io/projected/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-kube-api-access-sdk6z\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.203849 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.333777 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cc66f58c7-79z2q" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.333758 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cc66f58c7-79z2q" event={"ID":"349997ee-6053-4d85-8eae-1d4adf3b347e","Type":"ContainerDied","Data":"bc14e919f8b3449414fb735c730884d8728eab6a3837b28384ad0a32a307afa1"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.333840 4783 scope.go:117] "RemoveContainer" containerID="408c4aa1ce013f629cebff83c2e2cac264dcd1ec753f8bffec9f7d71def3f984" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.346323 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cd96d8cbc-jvn7k" event={"ID":"e73bb130-1464-433b-b34d-4af489f73b46","Type":"ContainerDied","Data":"17312f09b941fce56994b0cd5dc31766ee5dbacb8bc604072156d90f5067b760"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.346392 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cd96d8cbc-jvn7k" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.369751 4783 generic.go:334] "Generic (PLEG): container finished" podID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerID="184b1f0e049c40968bb5d4e9e2e7f3bf901d04dfae41d532236bdd87f80082af" exitCode=0 Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.369862 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" event={"ID":"f7684372-8d95-457c-b0a7-a58bb2cbf149","Type":"ContainerDied","Data":"184b1f0e049c40968bb5d4e9e2e7f3bf901d04dfae41d532236bdd87f80082af"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.382675 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.388778 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-cc66f58c7-79z2q"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.414641 4783 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5989bc564f-6q4l6" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.414701 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5989bc564f-6q4l6" event={"ID":"89312075-7597-4743-b92c-58411b26f1ec","Type":"ContainerDied","Data":"7244e6cc3caf7fc0265490fefbf19c0b20e802ba037473ce8073f3ee29435ad3"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.417174 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.434956 4783 generic.go:334] "Generic (PLEG): container finished" podID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerID="92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa" exitCode=0 Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.435177 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerDied","Data":"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.435222 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k8d2" event={"ID":"c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c","Type":"ContainerDied","Data":"a155f2f40a6d62b3aca390353ce90204bc0593d6690e09d3cda7c183316d6693"} Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.435892 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7k8d2" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.459938 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cd96d8cbc-jvn7k"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.480510 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.518434 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.524430 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5989bc564f-6q4l6"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.536210 4783 scope.go:117] "RemoveContainer" containerID="c0759064018d6758926b71f9e8597f9a998bd45b9e23a808b5eaa8ff4d6aeef6" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.555801 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.570460 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7k8d2"] Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.613652 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.626549 4783 scope.go:117] "RemoveContainer" containerID="fb71f926c1a6ece1f05a32110e077eaf0f05f8ec37d922bbb2a1088d409bd733" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.723975 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.724076 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58kqc\" (UniqueName: \"kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.724197 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.724217 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.724267 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") 
" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.724500 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0\") pod \"f7684372-8d95-457c-b0a7-a58bb2cbf149\" (UID: \"f7684372-8d95-457c-b0a7-a58bb2cbf149\") " Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.754095 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc" (OuterVolumeSpecName: "kube-api-access-58kqc") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "kube-api-access-58kqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.771834 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.795281 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.808478 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config" (OuterVolumeSpecName: "config") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.826755 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.829272 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.829318 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.829331 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58kqc\" (UniqueName: \"kubernetes.io/projected/f7684372-8d95-457c-b0a7-a58bb2cbf149-kube-api-access-58kqc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.829343 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc 
kubenswrapper[4783]: I0131 09:19:30.850353 4783 scope.go:117] "RemoveContainer" containerID="9bdb2e062a7ad48e696ef622530639ab29e48c755d8a1d9c67c9433bfcc380cd" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.864369 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7684372-8d95-457c-b0a7-a58bb2cbf149" (UID: "f7684372-8d95-457c-b0a7-a58bb2cbf149"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.898558 4783 scope.go:117] "RemoveContainer" containerID="5837f58fda0b78176bc8be8853f671a14f27ac76e9d7dad37189349edab34ccd" Jan 31 09:19:30 crc kubenswrapper[4783]: E0131 09:19:30.901261 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7e6f7a_4b59_42fd_9ef2_4f761e2d0af9.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.937355 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:30 crc kubenswrapper[4783]: I0131 09:19:30.937390 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7684372-8d95-457c-b0a7-a58bb2cbf149-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.049544 4783 scope.go:117] "RemoveContainer" 
containerID="f463ce51fb0c33bcc2b49a6a026db8c692b2c493624d349193048fd4dc640e47" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.069899 4783 scope.go:117] "RemoveContainer" containerID="92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.084934 4783 scope.go:117] "RemoveContainer" containerID="4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.104328 4783 scope.go:117] "RemoveContainer" containerID="f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.125831 4783 scope.go:117] "RemoveContainer" containerID="92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa" Jan 31 09:19:31 crc kubenswrapper[4783]: E0131 09:19:31.126174 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa\": container with ID starting with 92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa not found: ID does not exist" containerID="92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.126219 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa"} err="failed to get container status \"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa\": rpc error: code = NotFound desc = could not find container \"92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa\": container with ID starting with 92372dd5825e3ccfb353e71d962aa07df39af0b357f43168a070b8efc33bafaa not found: ID does not exist" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.126243 4783 scope.go:117] "RemoveContainer" 
containerID="4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5" Jan 31 09:19:31 crc kubenswrapper[4783]: E0131 09:19:31.126615 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5\": container with ID starting with 4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5 not found: ID does not exist" containerID="4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.126639 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5"} err="failed to get container status \"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5\": rpc error: code = NotFound desc = could not find container \"4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5\": container with ID starting with 4b60300b010e61250925218d84bf08058c2795d1d60aa931f0cd048ffc9024c5 not found: ID does not exist" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.126653 4783 scope.go:117] "RemoveContainer" containerID="f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6" Jan 31 09:19:31 crc kubenswrapper[4783]: E0131 09:19:31.126929 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6\": container with ID starting with f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6 not found: ID does not exist" containerID="f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.126962 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6"} err="failed to get container status \"f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6\": rpc error: code = NotFound desc = could not find container \"f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6\": container with ID starting with f9a65763325f9fa5fb8673ebfc7c3c83f8e8a5e5fc58da30c9e46367fd5e18a6 not found: ID does not exist" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.448880 4783 generic.go:334] "Generic (PLEG): container finished" podID="8dda3593-0628-4253-995b-b662d252462e" containerID="fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458" exitCode=0 Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.448951 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerDied","Data":"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458"} Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.465912 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" event={"ID":"f7684372-8d95-457c-b0a7-a58bb2cbf149","Type":"ContainerDied","Data":"be6df1a540b61707e1f16c5a78fe91cf64c57fd14bf22e8ff7fe728f66919976"} Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.466073 4783 scope.go:117] "RemoveContainer" containerID="184b1f0e049c40968bb5d4e9e2e7f3bf901d04dfae41d532236bdd87f80082af" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.466175 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="probe" containerID="cri-o://98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a" gracePeriod=30 Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.465963 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-56zzg" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.466090 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="cinder-scheduler" containerID="cri-o://b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc" gracePeriod=30 Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.486790 4783 scope.go:117] "RemoveContainer" containerID="e461a5b229024f7277c023c14030616b12adaa3ce920bcf93ee57a15d8a92425" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.497817 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.507303 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-56zzg"] Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.656489 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" path="/var/lib/kubelet/pods/349997ee-6053-4d85-8eae-1d4adf3b347e/volumes" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.657148 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89312075-7597-4743-b92c-58411b26f1ec" path="/var/lib/kubelet/pods/89312075-7597-4743-b92c-58411b26f1ec/volumes" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.657748 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" path="/var/lib/kubelet/pods/c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c/volumes" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.658909 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e73bb130-1464-433b-b34d-4af489f73b46" path="/var/lib/kubelet/pods/e73bb130-1464-433b-b34d-4af489f73b46/volumes" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.659475 4783 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" path="/var/lib/kubelet/pods/f7684372-8d95-457c-b0a7-a58bb2cbf149/volumes" Jan 31 09:19:31 crc kubenswrapper[4783]: I0131 09:19:31.805740 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.016216 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7994d94564-47gt2" Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.069520 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"] Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.069862 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-676f7f866-qr2ck" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api-log" containerID="cri-o://69cba238106285873331f2043c781dbc4270d7edf1f10b318ab5d42268a7cf29" gracePeriod=30 Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.070004 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-676f7f866-qr2ck" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api" containerID="cri-o://0e192ff1a78289e7a4877820e9e898abbbee52d93fb499e3ba8df5ce67fc2553" gracePeriod=30 Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.485311 4783 generic.go:334] "Generic (PLEG): container finished" podID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerID="69cba238106285873331f2043c781dbc4270d7edf1f10b318ab5d42268a7cf29" exitCode=143 Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.485538 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" 
event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerDied","Data":"69cba238106285873331f2043c781dbc4270d7edf1f10b318ab5d42268a7cf29"} Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.488019 4783 generic.go:334] "Generic (PLEG): container finished" podID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerID="98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a" exitCode=0 Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.488095 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerDied","Data":"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a"} Jan 31 09:19:32 crc kubenswrapper[4783]: I0131 09:19:32.776879 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.045943 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.046471 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vxv6q" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="registry-server" containerID="cri-o://101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2" gracePeriod=2 Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.470676 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.501828 4783 generic.go:334] "Generic (PLEG): container finished" podID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerID="101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2" exitCode=0 Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.501907 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vxv6q" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.501903 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerDied","Data":"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2"} Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.502010 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vxv6q" event={"ID":"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7","Type":"ContainerDied","Data":"a6a89c35eb100b12139c8a66a94e0e405fc063ba66570d656b0a34a375689680"} Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.502047 4783 scope.go:117] "RemoveContainer" containerID="101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.528777 4783 scope.go:117] "RemoveContainer" containerID="656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.571405 4783 scope.go:117] "RemoveContainer" containerID="d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.591132 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-697wt\" (UniqueName: \"kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt\") pod 
\"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.591527 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities\") pod \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.591652 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content\") pod \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\" (UID: \"43fedf89-07db-4c75-a8c3-9f04bbc9c6f7\") " Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.592658 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities" (OuterVolumeSpecName: "utilities") pod "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" (UID: "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.597791 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt" (OuterVolumeSpecName: "kube-api-access-697wt") pod "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" (UID: "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7"). InnerVolumeSpecName "kube-api-access-697wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.614496 4783 scope.go:117] "RemoveContainer" containerID="101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2" Jan 31 09:19:33 crc kubenswrapper[4783]: E0131 09:19:33.614904 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2\": container with ID starting with 101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2 not found: ID does not exist" containerID="101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.614944 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2"} err="failed to get container status \"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2\": rpc error: code = NotFound desc = could not find container \"101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2\": container with ID starting with 101b1e86217c8d0db3be0e3180dcaf21e282a96c3889e1987c56cbde4498d6f2 not found: ID does not exist" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.614969 4783 scope.go:117] "RemoveContainer" containerID="656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0" Jan 31 09:19:33 crc kubenswrapper[4783]: E0131 09:19:33.615815 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0\": container with ID starting with 656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0 not found: ID does not exist" containerID="656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.615850 
4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0"} err="failed to get container status \"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0\": rpc error: code = NotFound desc = could not find container \"656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0\": container with ID starting with 656e640701a9b950a553e031f4f33c1c42756be10e2b9a83921072ba8d7687a0 not found: ID does not exist" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.615878 4783 scope.go:117] "RemoveContainer" containerID="d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3" Jan 31 09:19:33 crc kubenswrapper[4783]: E0131 09:19:33.616108 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3\": container with ID starting with d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3 not found: ID does not exist" containerID="d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.616133 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3"} err="failed to get container status \"d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3\": rpc error: code = NotFound desc = could not find container \"d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3\": container with ID starting with d98bf55b9245bf97f7d7575fec0f8ca9e0b1e81088b011780ac993458e19cdc3 not found: ID does not exist" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.616475 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" (UID: "43fedf89-07db-4c75-a8c3-9f04bbc9c6f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.694635 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-697wt\" (UniqueName: \"kubernetes.io/projected/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-kube-api-access-697wt\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.694661 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.694672 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.835288 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:33 crc kubenswrapper[4783]: I0131 09:19:33.841324 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vxv6q"] Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.232962 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676f7f866-qr2ck" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:42140->10.217.0.160:9311: read: connection reset by peer" Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.233220 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676f7f866-qr2ck" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:42130->10.217.0.160:9311: read: connection reset by peer" Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.527414 4783 generic.go:334] "Generic (PLEG): container finished" podID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerID="0e192ff1a78289e7a4877820e9e898abbbee52d93fb499e3ba8df5ce67fc2553" exitCode=0 Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.527623 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerDied","Data":"0e192ff1a78289e7a4877820e9e898abbbee52d93fb499e3ba8df5ce67fc2553"} Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.533515 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.533608 4783 generic.go:334] "Generic (PLEG): container finished" podID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerID="b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc" exitCode=0 Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.533678 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerDied","Data":"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"} Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.533740 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b557cb95-3ddb-4f51-857f-7e044b7975f3","Type":"ContainerDied","Data":"9607c34d37758ead039cb10489743048b74467f5c77e50cf34302b21b22c1df3"} Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.533794 4783 scope.go:117] "RemoveContainer" containerID="98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a" Jan 31 09:19:35 crc 
kubenswrapper[4783]: I0131 09:19:35.603926 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676f7f866-qr2ck"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.604899 4783 scope.go:117] "RemoveContainer" containerID="b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628061 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628189 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628244 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628305 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628338 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptcz6\" (UniqueName: \"kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628520 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id\") pod \"b557cb95-3ddb-4f51-857f-7e044b7975f3\" (UID: \"b557cb95-3ddb-4f51-857f-7e044b7975f3\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.628972 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.634803 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6" (OuterVolumeSpecName: "kube-api-access-ptcz6") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "kube-api-access-ptcz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.635645 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts" (OuterVolumeSpecName: "scripts") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.638281 4783 scope.go:117] "RemoveContainer" containerID="98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.638969 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a\": container with ID starting with 98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a not found: ID does not exist" containerID="98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.639016 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a"} err="failed to get container status \"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a\": rpc error: code = NotFound desc = could not find container \"98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a\": container with ID starting with 98cbff8088c837cfc6cab464192835263274b4aa0653a5e965292b0636b7db5a not found: ID does not exist"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.639048 4783 scope.go:117] "RemoveContainer" containerID="b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.639533 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc\": container with ID starting with b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc not found: ID does not exist" containerID="b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.639577 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc"} err="failed to get container status \"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc\": rpc error: code = NotFound desc = could not find container \"b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc\": container with ID starting with b619429584aed0ba814e6d1fddab79ace3d922033a55f1aa9e0c15e390fd66fc not found: ID does not exist"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.641021 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.655464 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" path="/var/lib/kubelet/pods/43fedf89-07db-4c75-a8c3-9f04bbc9c6f7/volumes"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.681550 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.725375 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data" (OuterVolumeSpecName: "config-data") pod "b557cb95-3ddb-4f51-857f-7e044b7975f3" (UID: "b557cb95-3ddb-4f51-857f-7e044b7975f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.730258 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs\") pod \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.730356 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data\") pod \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.730427 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle\") pod \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.730619 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtt49\" (UniqueName: \"kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49\") pod \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.731091 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom\") pod \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\" (UID: \"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa\") "
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.731535 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs" (OuterVolumeSpecName: "logs") pod "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" (UID: "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732044 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732064 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptcz6\" (UniqueName: \"kubernetes.io/projected/b557cb95-3ddb-4f51-857f-7e044b7975f3-kube-api-access-ptcz6\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732077 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-logs\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732086 4783 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b557cb95-3ddb-4f51-857f-7e044b7975f3-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732095 4783 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732103 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.732111 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b557cb95-3ddb-4f51-857f-7e044b7975f3-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.734043 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49" (OuterVolumeSpecName: "kube-api-access-qtt49") pod "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" (UID: "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa"). InnerVolumeSpecName "kube-api-access-qtt49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.734394 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" (UID: "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.748486 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" (UID: "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.767346 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data" (OuterVolumeSpecName: "config-data") pod "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" (UID: "5541dd7e-5d9c-4d8a-a09a-82cebe829aaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.837538 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.837569 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.837589 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtt49\" (UniqueName: \"kubernetes.io/projected/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-kube-api-access-qtt49\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.837600 4783 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856381 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cv69h"]
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856794 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856812 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856826 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856832 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856845 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856850 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856857 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="extract-content"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856862 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="extract-content"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856870 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856875 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856885 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856890 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856899 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="init"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856905 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="init"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856914 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="extract-utilities"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856920 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="extract-utilities"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856932 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856937 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856946 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="dnsmasq-dns"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856951 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="dnsmasq-dns"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856957 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856962 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856971 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856977 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856985 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="cinder-scheduler"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.856991 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="cinder-scheduler"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.856996 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857001 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857011 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="extract-content"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857016 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="extract-content"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857023 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="extract-utilities"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857027 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="extract-utilities"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857038 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857043 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857052 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="probe"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857057 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="probe"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857066 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857071 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: E0131 09:19:35.857077 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857082 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857234 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857246 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857253 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7684372-8d95-457c-b0a7-a58bb2cbf149" containerName="dnsmasq-dns"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857262 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857271 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e73bb130-1464-433b-b34d-4af489f73b46" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857275 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857283 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-api"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857288 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" containerName="barbican-api-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857293 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53d07e9-43d7-4aaa-b5c7-2b56c8bb464c" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857302 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="349997ee-6053-4d85-8eae-1d4adf3b347e" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857311 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4276e01a-227a-4370-8b9b-cfc5123aa13d" containerName="neutron-httpd"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857320 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fedf89-07db-4c75-a8c3-9f04bbc9c6f7" containerName="registry-server"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857328 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="89312075-7597-4743-b92c-58411b26f1ec" containerName="horizon-log"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857334 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="probe"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.857343 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" containerName="cinder-scheduler"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.858546 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.863095 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv69h"]
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.939488 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.939768 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:35 crc kubenswrapper[4783]: I0131 09:19:35.939830 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffl8w\" (UniqueName: \"kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.040935 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.040976 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffl8w\" (UniqueName: \"kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.041013 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.041566 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.041805 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.055633 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffl8w\" (UniqueName: \"kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w\") pod \"community-operators-cv69h\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.174235 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv69h"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.547843 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676f7f866-qr2ck" event={"ID":"5541dd7e-5d9c-4d8a-a09a-82cebe829aaa","Type":"ContainerDied","Data":"09be4ca1425941bb718a979077c9128c6d7473125a357da85490821cc225b829"}
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.547892 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676f7f866-qr2ck"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.548620 4783 scope.go:117] "RemoveContainer" containerID="0e192ff1a78289e7a4877820e9e898abbbee52d93fb499e3ba8df5ce67fc2553"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.551334 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.574567 4783 scope.go:117] "RemoveContainer" containerID="69cba238106285873331f2043c781dbc4270d7edf1f10b318ab5d42268a7cf29"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.602623 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.618114 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-676f7f866-qr2ck"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.623537 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.628605 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.634760 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.636300 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.639539 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.640578 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.659055 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv69h"]
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.766923 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltnsv\" (UniqueName: \"kubernetes.io/projected/1eb12305-93aa-4b0a-960a-939eb7b74bec-kube-api-access-ltnsv\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.767006 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.767075 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.767104 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.767280 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.767367 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb12305-93aa-4b0a-960a-939eb7b74bec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.869822 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb12305-93aa-4b0a-960a-939eb7b74bec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.869940 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltnsv\" (UniqueName: \"kubernetes.io/projected/1eb12305-93aa-4b0a-960a-939eb7b74bec-kube-api-access-ltnsv\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.869977 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.869980 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eb12305-93aa-4b0a-960a-939eb7b74bec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.870023 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.870071 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.870102 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.875925 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.876144 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.876072 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.876934 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb12305-93aa-4b0a-960a-939eb7b74bec-scripts\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.883471 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltnsv\" (UniqueName: \"kubernetes.io/projected/1eb12305-93aa-4b0a-960a-939eb7b74bec-kube-api-access-ltnsv\") pod \"cinder-scheduler-0\" (UID: \"1eb12305-93aa-4b0a-960a-939eb7b74bec\") " pod="openstack/cinder-scheduler-0"
Jan 31 09:19:36 crc kubenswrapper[4783]: I0131 09:19:36.954123 4783 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.137142 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.363487 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:19:37 crc kubenswrapper[4783]: W0131 09:19:37.363741 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb12305_93aa_4b0a_960a_939eb7b74bec.slice/crio-3391343d281835e4649aa64ef9c10f11f87fad8f9c8aa8389b51cbe5409ee0ad WatchSource:0}: Error finding container 3391343d281835e4649aa64ef9c10f11f87fad8f9c8aa8389b51cbe5409ee0ad: Status 404 returned error can't find the container with id 3391343d281835e4649aa64ef9c10f11f87fad8f9c8aa8389b51cbe5409ee0ad Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.573753 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1eb12305-93aa-4b0a-960a-939eb7b74bec","Type":"ContainerStarted","Data":"3391343d281835e4649aa64ef9c10f11f87fad8f9c8aa8389b51cbe5409ee0ad"} Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.576745 4783 generic.go:334] "Generic (PLEG): container finished" podID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerID="181efee9df1ee83e93ecc7545298584ea582d1b72484c250b0fcd142a8cf149d" exitCode=0 Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.576784 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerDied","Data":"181efee9df1ee83e93ecc7545298584ea582d1b72484c250b0fcd142a8cf149d"} Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.576885 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" 
event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerStarted","Data":"ce8845413a11bfce138082094e7c761f40ed82f75c39084c84ceba0a21f49175"} Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.658682 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5541dd7e-5d9c-4d8a-a09a-82cebe829aaa" path="/var/lib/kubelet/pods/5541dd7e-5d9c-4d8a-a09a-82cebe829aaa/volumes" Jan 31 09:19:37 crc kubenswrapper[4783]: I0131 09:19:37.659420 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b557cb95-3ddb-4f51-857f-7e044b7975f3" path="/var/lib/kubelet/pods/b557cb95-3ddb-4f51-857f-7e044b7975f3/volumes" Jan 31 09:19:38 crc kubenswrapper[4783]: I0131 09:19:38.639308 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1eb12305-93aa-4b0a-960a-939eb7b74bec","Type":"ContainerStarted","Data":"3b3155a7df9335a78237bd7d4072109d5d6cb2b1bf598a80dc06d320334462a0"} Jan 31 09:19:38 crc kubenswrapper[4783]: I0131 09:19:38.639909 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1eb12305-93aa-4b0a-960a-939eb7b74bec","Type":"ContainerStarted","Data":"9fbb54c982390739ac57ea03f6554f2a2d690436ddc02f3bf883cdb53b2a3756"} Jan 31 09:19:38 crc kubenswrapper[4783]: I0131 09:19:38.652955 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerStarted","Data":"59da742cf8ef60781d68cf920dacf4196deead8dfa110a1f3e94706551f94f13"} Jan 31 09:19:38 crc kubenswrapper[4783]: I0131 09:19:38.671573 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.671551846 podStartE2EDuration="2.671551846s" podCreationTimestamp="2026-01-31 09:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:19:38.66389923 +0000 UTC m=+889.332582698" watchObservedRunningTime="2026-01-31 09:19:38.671551846 +0000 UTC m=+889.340235314" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.194023 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.265429 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.576218 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.612208 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-688b79757c-l8xjk" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.678504 4783 generic.go:334] "Generic (PLEG): container finished" podID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerID="59da742cf8ef60781d68cf920dacf4196deead8dfa110a1f3e94706551f94f13" exitCode=0 Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.678718 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerDied","Data":"59da742cf8ef60781d68cf920dacf4196deead8dfa110a1f3e94706551f94f13"} Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.863789 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5df45c7c98-nt6z5" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.869770 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.870592 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:39 crc kubenswrapper[4783]: I0131 09:19:39.917347 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:40 crc kubenswrapper[4783]: I0131 09:19:40.688039 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerStarted","Data":"ffef1e41e300079ad8a0d2518b587f3dd300ee04a9422528f1304ea7c148afc4"} Jan 31 09:19:40 crc kubenswrapper[4783]: I0131 09:19:40.707976 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cv69h" podStartSLOduration=3.109966011 podStartE2EDuration="5.707954971s" podCreationTimestamp="2026-01-31 09:19:35 +0000 UTC" firstStartedPulling="2026-01-31 09:19:37.580896349 +0000 UTC m=+888.249579818" lastFinishedPulling="2026-01-31 09:19:40.17888531 +0000 UTC m=+890.847568778" observedRunningTime="2026-01-31 09:19:40.70509551 +0000 UTC m=+891.373778979" watchObservedRunningTime="2026-01-31 09:19:40.707954971 +0000 UTC m=+891.376638439" Jan 31 09:19:41 crc kubenswrapper[4783]: E0131 09:19:41.124570 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e03f3_fa41_487d_affe_89222406f4bb.slice/crio-84fdab98a2affa05717215feb4e4383412a44843ea778db228bbfc3433d763c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7e6f7a_4b59_42fd_9ef2_4f761e2d0af9.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.443777 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.444247 4783 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dmfcr" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="registry-server" containerID="cri-o://5ec55dd2413d0eacf1ee06eda3ffc9c4be3e3c6d5215ce3d51a046c8144443e4" gracePeriod=2 Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.701887 4783 generic.go:334] "Generic (PLEG): container finished" podID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerID="5ec55dd2413d0eacf1ee06eda3ffc9c4be3e3c6d5215ce3d51a046c8144443e4" exitCode=0 Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.701966 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerDied","Data":"5ec55dd2413d0eacf1ee06eda3ffc9c4be3e3c6d5215ce3d51a046c8144443e4"} Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.702129 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b45945f8-49g2r" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-log" containerID="cri-o://4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1" gracePeriod=30 Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.702241 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b45945f8-49g2r" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-api" containerID="cri-o://04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb" gracePeriod=30 Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.887882 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.955508 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.990413 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities\") pod \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.990581 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m47rv\" (UniqueName: \"kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv\") pod \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.990689 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content\") pod \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\" (UID: \"f4f754dd-ba60-4f7b-b96b-ac8f2530250c\") " Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.990892 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities" (OuterVolumeSpecName: "utilities") pod "f4f754dd-ba60-4f7b-b96b-ac8f2530250c" (UID: "f4f754dd-ba60-4f7b-b96b-ac8f2530250c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.991193 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:41 crc kubenswrapper[4783]: I0131 09:19:41.997355 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv" (OuterVolumeSpecName: "kube-api-access-m47rv") pod "f4f754dd-ba60-4f7b-b96b-ac8f2530250c" (UID: "f4f754dd-ba60-4f7b-b96b-ac8f2530250c"). InnerVolumeSpecName "kube-api-access-m47rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.089330 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f754dd-ba60-4f7b-b96b-ac8f2530250c" (UID: "f4f754dd-ba60-4f7b-b96b-ac8f2530250c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.093944 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m47rv\" (UniqueName: \"kubernetes.io/projected/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-kube-api-access-m47rv\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.093971 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f754dd-ba60-4f7b-b96b-ac8f2530250c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.709410 4783 generic.go:334] "Generic (PLEG): container finished" podID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerID="4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1" exitCode=143 Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.709757 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerDied","Data":"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1"} Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.711244 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmfcr" event={"ID":"f4f754dd-ba60-4f7b-b96b-ac8f2530250c","Type":"ContainerDied","Data":"52457fa5f831f47e074ecd4c7e736c7748e6a2513a1475254742614bed8e509d"} Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.711281 4783 scope.go:117] "RemoveContainer" containerID="5ec55dd2413d0eacf1ee06eda3ffc9c4be3e3c6d5215ce3d51a046c8144443e4" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.711394 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmfcr" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.737638 4783 scope.go:117] "RemoveContainer" containerID="ae7fce32118e69b843e67e5267f1b57a0b020c45a6d104f24057490ed9b3003d" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.759499 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.763413 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dmfcr"] Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.763784 4783 scope.go:117] "RemoveContainer" containerID="c1dc630b83a51d2dcba2bae844cc4212d79e8d89278eba7264f1d2987846d042" Jan 31 09:19:42 crc kubenswrapper[4783]: I0131 09:19:42.776844 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.161422 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 31 09:19:43 crc kubenswrapper[4783]: E0131 09:19:43.161862 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="extract-content" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.161881 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="extract-content" Jan 31 09:19:43 crc kubenswrapper[4783]: E0131 09:19:43.161899 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="extract-utilities" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.161905 4783 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="extract-utilities" Jan 31 09:19:43 crc kubenswrapper[4783]: E0131 09:19:43.161935 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="registry-server" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.161941 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="registry-server" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.162119 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" containerName="registry-server" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.162749 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.164562 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.164624 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.164684 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2kj4m" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.173150 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.318705 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: 
I0131 09:19:43.319404 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkhx\" (UniqueName: \"kubernetes.io/projected/b1789966-6119-4be7-87b8-cca3381fc380-kube-api-access-2hkhx\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.319699 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.319790 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.422648 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hkhx\" (UniqueName: \"kubernetes.io/projected/b1789966-6119-4be7-87b8-cca3381fc380-kube-api-access-2hkhx\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.422756 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.422791 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.422865 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.423898 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.428321 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-openstack-config-secret\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.428836 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1789966-6119-4be7-87b8-cca3381fc380-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.446414 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hkhx\" (UniqueName: \"kubernetes.io/projected/b1789966-6119-4be7-87b8-cca3381fc380-kube-api-access-2hkhx\") pod 
\"openstackclient\" (UID: \"b1789966-6119-4be7-87b8-cca3381fc380\") " pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.475881 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.659257 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f754dd-ba60-4f7b-b96b-ac8f2530250c" path="/var/lib/kubelet/pods/f4f754dd-ba60-4f7b-b96b-ac8f2530250c/volumes" Jan 31 09:19:43 crc kubenswrapper[4783]: I0131 09:19:43.871912 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 31 09:19:43 crc kubenswrapper[4783]: W0131 09:19:43.882322 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1789966_6119_4be7_87b8_cca3381fc380.slice/crio-1b20ea3d97ffa5cf24d192cac5858d2eb658ab61234443c81361afa05aa570c4 WatchSource:0}: Error finding container 1b20ea3d97ffa5cf24d192cac5858d2eb658ab61234443c81361afa05aa570c4: Status 404 returned error can't find the container with id 1b20ea3d97ffa5cf24d192cac5858d2eb658ab61234443c81361afa05aa570c4 Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.710491 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.710948 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-central-agent" containerID="cri-o://62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067" gracePeriod=30 Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.711050 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-notification-agent" 
containerID="cri-o://57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af" gracePeriod=30 Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.711079 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="proxy-httpd" containerID="cri-o://0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de" gracePeriod=30 Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.711075 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="sg-core" containerID="cri-o://440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd" gracePeriod=30 Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.714193 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:19:44 crc kubenswrapper[4783]: I0131 09:19:44.739637 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b1789966-6119-4be7-87b8-cca3381fc380","Type":"ContainerStarted","Data":"1b20ea3d97ffa5cf24d192cac5858d2eb658ab61234443c81361afa05aa570c4"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.264294 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377224 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377303 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377323 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377352 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377371 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8v8k\" (UniqueName: \"kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377394 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.377412 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data\") pod \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\" (UID: \"16e8a688-eb42-4c2b-a253-94f7ca54a51c\") " Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.378354 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs" (OuterVolumeSpecName: "logs") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.386720 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k" (OuterVolumeSpecName: "kube-api-access-v8v8k") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "kube-api-access-v8v8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.386735 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts" (OuterVolumeSpecName: "scripts") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.422757 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.425939 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data" (OuterVolumeSpecName: "config-data") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.452638 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.467413 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16e8a688-eb42-4c2b-a253-94f7ca54a51c" (UID: "16e8a688-eb42-4c2b-a253-94f7ca54a51c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480119 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480154 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480199 4783 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480208 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8v8k\" (UniqueName: \"kubernetes.io/projected/16e8a688-eb42-4c2b-a253-94f7ca54a51c-kube-api-access-v8v8k\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480219 4783 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480228 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8a688-eb42-4c2b-a253-94f7ca54a51c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.480238 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8a688-eb42-4c2b-a253-94f7ca54a51c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.758499 4783 generic.go:334] 
"Generic (PLEG): container finished" podID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerID="04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb" exitCode=0 Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.758602 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b45945f8-49g2r" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.758593 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerDied","Data":"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.758740 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b45945f8-49g2r" event={"ID":"16e8a688-eb42-4c2b-a253-94f7ca54a51c","Type":"ContainerDied","Data":"737f3858aa52343ae55b746f16f2d67a6b9d6780e21521c93d87bb5310a3bb43"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.758765 4783 scope.go:117] "RemoveContainer" containerID="04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764201 4783 generic.go:334] "Generic (PLEG): container finished" podID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerID="0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de" exitCode=0 Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764229 4783 generic.go:334] "Generic (PLEG): container finished" podID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerID="440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd" exitCode=2 Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764238 4783 generic.go:334] "Generic (PLEG): container finished" podID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerID="62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067" exitCode=0 Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764261 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerDied","Data":"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764288 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerDied","Data":"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.764300 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerDied","Data":"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067"} Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.785437 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.793374 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-67b45945f8-49g2r"] Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.794413 4783 scope.go:117] "RemoveContainer" containerID="4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.825103 4783 scope.go:117] "RemoveContainer" containerID="04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb" Jan 31 09:19:45 crc kubenswrapper[4783]: E0131 09:19:45.826249 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb\": container with ID starting with 04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb not found: ID does not exist" containerID="04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb" Jan 31 09:19:45 crc 
kubenswrapper[4783]: I0131 09:19:45.826284 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb"} err="failed to get container status \"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb\": rpc error: code = NotFound desc = could not find container \"04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb\": container with ID starting with 04e9e8f69d87b4456bbb9e9898b32975f10e33346889f421c328ea108ddb16bb not found: ID does not exist" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.826307 4783 scope.go:117] "RemoveContainer" containerID="4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1" Jan 31 09:19:45 crc kubenswrapper[4783]: E0131 09:19:45.829365 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1\": container with ID starting with 4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1 not found: ID does not exist" containerID="4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1" Jan 31 09:19:45 crc kubenswrapper[4783]: I0131 09:19:45.829395 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1"} err="failed to get container status \"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1\": rpc error: code = NotFound desc = could not find container \"4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1\": container with ID starting with 4282654d6b08dc89cac0b34ad107abd6b2a91d2ed7a800a188b8f034e39e90b1 not found: ID does not exist" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.169040 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.174589 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.174636 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.235611 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.300864 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301010 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzsqf\" (UniqueName: \"kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301109 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301207 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: 
\"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301292 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301315 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.301398 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml\") pod \"1651d956-f474-4825-a9a7-c9a350d3e2b3\" (UID: \"1651d956-f474-4825-a9a7-c9a350d3e2b3\") " Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.303833 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.305568 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.308988 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts" (OuterVolumeSpecName: "scripts") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.319848 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf" (OuterVolumeSpecName: "kube-api-access-kzsqf") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "kube-api-access-kzsqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.329368 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.373613 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.389331 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data" (OuterVolumeSpecName: "config-data") pod "1651d956-f474-4825-a9a7-c9a350d3e2b3" (UID: "1651d956-f474-4825-a9a7-c9a350d3e2b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405734 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405768 4783 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405779 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405800 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405817 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1651d956-f474-4825-a9a7-c9a350d3e2b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405826 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1651d956-f474-4825-a9a7-c9a350d3e2b3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.405835 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzsqf\" (UniqueName: \"kubernetes.io/projected/1651d956-f474-4825-a9a7-c9a350d3e2b3-kube-api-access-kzsqf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.777282 4783 generic.go:334] "Generic (PLEG): container finished" podID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerID="57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af" exitCode=0 Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.777479 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.778568 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerDied","Data":"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af"} Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.778628 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1651d956-f474-4825-a9a7-c9a350d3e2b3","Type":"ContainerDied","Data":"332623a71829e75f4be7d85100bd1a5193305cdc96ecc7aa6ca2a51251e5bf36"} Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.778652 4783 scope.go:117] "RemoveContainer" containerID="0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.811055 4783 scope.go:117] "RemoveContainer" containerID="440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.814107 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.822312 4783 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835083 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835550 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="proxy-httpd" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835571 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="proxy-httpd" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835584 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="sg-core" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835593 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="sg-core" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835605 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-log" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835611 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-log" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835623 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-central-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835629 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-central-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835639 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" 
containerName="placement-api" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835646 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-api" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.835657 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-notification-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835665 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-notification-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835854 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-central-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835870 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-api" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835879 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="proxy-httpd" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835887 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" containerName="placement-log" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835897 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="sg-core" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.835914 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" containerName="ceilometer-notification-agent" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.837970 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.839997 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.840637 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.844153 4783 scope.go:117] "RemoveContainer" containerID="57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.845708 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.863487 4783 scope.go:117] "RemoveContainer" containerID="62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.891047 4783 scope.go:117] "RemoveContainer" containerID="0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.891532 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de\": container with ID starting with 0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de not found: ID does not exist" containerID="0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.891571 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de"} err="failed to get container status \"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de\": rpc error: code = NotFound desc = could not find container 
\"0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de\": container with ID starting with 0eabe8be1dd00d71d8bb61e0590a11c060639b160b551043f0e16e7f720243de not found: ID does not exist" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.891602 4783 scope.go:117] "RemoveContainer" containerID="440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.891929 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd\": container with ID starting with 440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd not found: ID does not exist" containerID="440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.891967 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd"} err="failed to get container status \"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd\": rpc error: code = NotFound desc = could not find container \"440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd\": container with ID starting with 440261fa003f5b096693ea8d38f0321e683a20df2eeb513684ecfd57bfdf33fd not found: ID does not exist" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.891988 4783 scope.go:117] "RemoveContainer" containerID="57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.892428 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af\": container with ID starting with 57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af not found: ID does not exist" 
containerID="57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.892483 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af"} err="failed to get container status \"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af\": rpc error: code = NotFound desc = could not find container \"57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af\": container with ID starting with 57ba25d3a96f9f4dd05e25d5bc530bf8d9197ceb128f459ceda40b87a1acf8af not found: ID does not exist" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.892522 4783 scope.go:117] "RemoveContainer" containerID="62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067" Jan 31 09:19:46 crc kubenswrapper[4783]: E0131 09:19:46.893999 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067\": container with ID starting with 62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067 not found: ID does not exist" containerID="62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.894023 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067"} err="failed to get container status \"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067\": rpc error: code = NotFound desc = could not find container \"62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067\": container with ID starting with 62547da9a9d9e7776b7d2cb74df76089a4ad63f3d057be33997a9f3d43c7d067 not found: ID does not exist" Jan 31 09:19:46 crc kubenswrapper[4783]: I0131 09:19:46.898182 4783 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.022407 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.022503 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.023156 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4bx\" (UniqueName: \"kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.023243 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.023381 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.023503 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.025206 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.129857 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.129969 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130028 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130085 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4bx\" (UniqueName: \"kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx\") 
pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130118 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130209 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130286 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130880 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.130979 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.138105 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.138740 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.139628 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.140443 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.145243 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4bx\" (UniqueName: \"kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx\") pod \"ceilometer-0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.155018 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.163324 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.592275 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:47 crc kubenswrapper[4783]: W0131 09:19:47.605047 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd14d4b5_f44b_47e2_b273_e05e6e9047e0.slice/crio-fc0674cad5e4cabc301332a9ad031ac7593799fe8b2a40102c405196626bda97 WatchSource:0}: Error finding container fc0674cad5e4cabc301332a9ad031ac7593799fe8b2a40102c405196626bda97: Status 404 returned error can't find the container with id fc0674cad5e4cabc301332a9ad031ac7593799fe8b2a40102c405196626bda97 Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.653842 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1651d956-f474-4825-a9a7-c9a350d3e2b3" path="/var/lib/kubelet/pods/1651d956-f474-4825-a9a7-c9a350d3e2b3/volumes" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.654669 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e8a688-eb42-4c2b-a253-94f7ca54a51c" path="/var/lib/kubelet/pods/16e8a688-eb42-4c2b-a253-94f7ca54a51c/volumes" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.756965 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.757354 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" 
podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.796234 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerStarted","Data":"fc0674cad5e4cabc301332a9ad031ac7593799fe8b2a40102c405196626bda97"} Jan 31 09:19:47 crc kubenswrapper[4783]: I0131 09:19:47.845073 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv69h"] Jan 31 09:19:48 crc kubenswrapper[4783]: I0131 09:19:48.806257 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cv69h" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="registry-server" containerID="cri-o://ffef1e41e300079ad8a0d2518b587f3dd300ee04a9422528f1304ea7c148afc4" gracePeriod=2 Jan 31 09:19:48 crc kubenswrapper[4783]: I0131 09:19:48.961230 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.072534 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7846c976fc-knpz2"] Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.074437 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.077110 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.077299 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.077482 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.080831 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7846c976fc-knpz2"] Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.280198 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-internal-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.280909 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-public-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.281247 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqfpq\" (UniqueName: \"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-kube-api-access-rqfpq\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc 
kubenswrapper[4783]: I0131 09:19:49.281275 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-etc-swift\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.281523 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-run-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.281635 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-config-data\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.281867 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-log-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.282225 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-combined-ca-bundle\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 
09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383714 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-combined-ca-bundle\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383783 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-internal-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383856 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-public-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383881 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqfpq\" (UniqueName: \"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-kube-api-access-rqfpq\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383906 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-etc-swift\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 
09:19:49.383933 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-run-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383974 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-config-data\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.383992 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-log-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.384671 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-log-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.384840 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff4d96b6-b227-41e4-a653-39b8475aa9de-run-httpd\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.395570 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-public-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.395867 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-etc-swift\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.396614 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-config-data\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.397244 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-internal-tls-certs\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.397258 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4d96b6-b227-41e4-a653-39b8475aa9de-combined-ca-bundle\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.402902 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqfpq\" (UniqueName: 
\"kubernetes.io/projected/ff4d96b6-b227-41e4-a653-39b8475aa9de-kube-api-access-rqfpq\") pod \"swift-proxy-7846c976fc-knpz2\" (UID: \"ff4d96b6-b227-41e4-a653-39b8475aa9de\") " pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.746323 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.831264 4783 generic.go:334] "Generic (PLEG): container finished" podID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerID="ffef1e41e300079ad8a0d2518b587f3dd300ee04a9422528f1304ea7c148afc4" exitCode=0 Jan 31 09:19:49 crc kubenswrapper[4783]: I0131 09:19:49.831356 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerDied","Data":"ffef1e41e300079ad8a0d2518b587f3dd300ee04a9422528f1304ea7c148afc4"} Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.273301 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2s7lh"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.274350 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.283648 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2s7lh"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.380975 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.381025 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpf6h\" (UniqueName: \"kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.391498 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f95f-account-create-update-275t8"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.392845 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.394509 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.395990 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f95f-account-create-update-275t8"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.483079 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts\") pod \"nova-api-f95f-account-create-update-275t8\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.483271 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.483304 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpf6h\" (UniqueName: \"kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.483778 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvb6\" (UniqueName: \"kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6\") pod \"nova-api-f95f-account-create-update-275t8\" (UID: 
\"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.484241 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.496687 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-krxr5"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.498440 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.503674 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpf6h\" (UniqueName: \"kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h\") pod \"nova-api-db-create-2s7lh\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.505148 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-krxr5"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.580286 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pqf78"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.581645 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.585225 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.585315 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvb6\" (UniqueName: \"kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6\") pod \"nova-api-f95f-account-create-update-275t8\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.585381 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfvl\" (UniqueName: \"kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.585421 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts\") pod \"nova-api-f95f-account-create-update-275t8\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.586079 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts\") pod 
\"nova-api-f95f-account-create-update-275t8\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.589662 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bf84-account-create-update-nq754"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.590906 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.594371 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.595517 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.605198 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pqf78"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.614752 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf84-account-create-update-nq754"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.623792 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvb6\" (UniqueName: \"kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6\") pod \"nova-api-f95f-account-create-update-275t8\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.688995 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6ww\" (UniqueName: \"kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: 
\"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.689113 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.689148 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfvl\" (UniqueName: \"kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.689257 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86k9g\" (UniqueName: \"kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g\") pod \"nova-cell1-db-create-pqf78\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.689375 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts\") pod \"nova-cell1-db-create-pqf78\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.689400 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.690233 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.711885 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfvl\" (UniqueName: \"kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl\") pod \"nova-cell0-db-create-krxr5\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.712189 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.783589 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6717-account-create-update-p4bsh"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.784871 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.787054 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.789260 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6717-account-create-update-p4bsh"] Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.791710 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt6ww\" (UniqueName: \"kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.791786 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.791817 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86k9g\" (UniqueName: \"kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g\") pod \"nova-cell1-db-create-pqf78\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.791883 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts\") pod \"nova-cell1-db-create-pqf78\" (UID: 
\"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.792921 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.792943 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts\") pod \"nova-cell1-db-create-pqf78\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.813960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k9g\" (UniqueName: \"kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g\") pod \"nova-cell1-db-create-pqf78\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.814277 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6ww\" (UniqueName: \"kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww\") pod \"nova-cell0-bf84-account-create-update-nq754\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.853288 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.893501 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vpz\" (UniqueName: \"kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.893834 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.896229 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.910623 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.996625 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.996695 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vpz\" (UniqueName: \"kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:50 crc kubenswrapper[4783]: I0131 09:19:50.997534 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.011590 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vpz\" (UniqueName: \"kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz\") pod \"nova-cell1-6717-account-create-update-p4bsh\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.102289 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.356476 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-754bf5467-627tt" Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.422341 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.422590 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-548c6d58db-rh9pc" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-api" containerID="cri-o://20aefd71d61b3c915fbf37bbf3e29db31841607652a4d8ed8d8fad64aa04c572" gracePeriod=30 Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.422658 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-548c6d58db-rh9pc" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-httpd" containerID="cri-o://eeeb2276e4b6fb5bc70d48279133cb29782cf70016c2a9c62c5c25a1d4f94cd2" gracePeriod=30 Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.868638 4783 generic.go:334] "Generic (PLEG): container finished" podID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerID="eeeb2276e4b6fb5bc70d48279133cb29782cf70016c2a9c62c5c25a1d4f94cd2" exitCode=0 Jan 31 09:19:51 crc kubenswrapper[4783]: I0131 09:19:51.868682 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerDied","Data":"eeeb2276e4b6fb5bc70d48279133cb29782cf70016c2a9c62c5c25a1d4f94cd2"} Jan 31 09:19:52 crc kubenswrapper[4783]: I0131 09:19:52.778854 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f5ff596f4-ffmss" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 31 09:19:52 crc kubenswrapper[4783]: I0131 09:19:52.779245 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.683784 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.684003 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-log" containerID="cri-o://c62b41e75fbb1a433cfb1c2dff030c8b66835350f6bab80fb2356f17562c3834" gracePeriod=30 Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.684114 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-httpd" containerID="cri-o://5b01fff520a9d67f2f3f01f9adc270009f597046e676c124e99d31cbbc8438f3" gracePeriod=30 Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.927731 4783 generic.go:334] "Generic (PLEG): container finished" podID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerID="20aefd71d61b3c915fbf37bbf3e29db31841607652a4d8ed8d8fad64aa04c572" exitCode=0 Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.927816 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerDied","Data":"20aefd71d61b3c915fbf37bbf3e29db31841607652a4d8ed8d8fad64aa04c572"} Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.930225 4783 generic.go:334] "Generic (PLEG): container finished" podID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerID="c62b41e75fbb1a433cfb1c2dff030c8b66835350f6bab80fb2356f17562c3834" 
exitCode=143 Jan 31 09:19:53 crc kubenswrapper[4783]: I0131 09:19:53.930257 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerDied","Data":"c62b41e75fbb1a433cfb1c2dff030c8b66835350f6bab80fb2356f17562c3834"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.058916 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.173557 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffl8w\" (UniqueName: \"kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w\") pod \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.173804 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities\") pod \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.173947 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content\") pod \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\" (UID: \"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.174454 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities" (OuterVolumeSpecName: "utilities") pod "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" (UID: "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.186920 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w" (OuterVolumeSpecName: "kube-api-access-ffl8w") pod "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" (UID: "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce"). InnerVolumeSpecName "kube-api-access-ffl8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.247378 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" (UID: "56bc0e52-0e88-44a1-a30e-e8b3bafba9ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.275873 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffl8w\" (UniqueName: \"kubernetes.io/projected/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-kube-api-access-ffl8w\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.276039 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.276048 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.289533 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf84-account-create-update-nq754"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 
09:19:54.374381 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.479807 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpgp8\" (UniqueName: \"kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8\") pod \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.480000 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs\") pod \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.480215 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle\") pod \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.480332 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config\") pod \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.480638 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config\") pod \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\" (UID: \"8c5f7b62-eadb-483b-b336-34e1ba8e881b\") " Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.488759 4783 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8" (OuterVolumeSpecName: "kube-api-access-bpgp8") pod "8c5f7b62-eadb-483b-b336-34e1ba8e881b" (UID: "8c5f7b62-eadb-483b-b336-34e1ba8e881b"). InnerVolumeSpecName "kube-api-access-bpgp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.508263 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8c5f7b62-eadb-483b-b336-34e1ba8e881b" (UID: "8c5f7b62-eadb-483b-b336-34e1ba8e881b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.565989 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config" (OuterVolumeSpecName: "config") pod "8c5f7b62-eadb-483b-b336-34e1ba8e881b" (UID: "8c5f7b62-eadb-483b-b336-34e1ba8e881b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.573868 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-krxr5"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.584275 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.584310 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.584369 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpgp8\" (UniqueName: \"kubernetes.io/projected/8c5f7b62-eadb-483b-b336-34e1ba8e881b-kube-api-access-bpgp8\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.603232 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pqf78"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.605092 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c5f7b62-eadb-483b-b336-34e1ba8e881b" (UID: "8c5f7b62-eadb-483b-b336-34e1ba8e881b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.613471 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6717-account-create-update-p4bsh"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.625881 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7846c976fc-knpz2"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.677611 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8c5f7b62-eadb-483b-b336-34e1ba8e881b" (UID: "8c5f7b62-eadb-483b-b336-34e1ba8e881b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.686883 4783 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.686908 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5f7b62-eadb-483b-b336-34e1ba8e881b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.736740 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.737044 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-log" containerID="cri-o://7b93ba43db8cca037d4ec83f5e78efbed6d4e07779129bd77b31e7f73945096c" gracePeriod=30 Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.737347 4783 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-httpd" containerID="cri-o://467bb4b3b850268e9cc8f923dd2680181e0341456bb9481b712e6537c005e9b3" gracePeriod=30 Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.745982 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f95f-account-create-update-275t8"] Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.753449 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2s7lh"] Jan 31 09:19:54 crc kubenswrapper[4783]: W0131 09:19:54.774489 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46eac665_6761_4ff3_8718_6417ccea545d.slice/crio-9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8 WatchSource:0}: Error finding container 9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8: Status 404 returned error can't find the container with id 9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8 Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.939913 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-krxr5" event={"ID":"6263f4cc-b742-4057-96a4-d4a058ad3f44","Type":"ContainerStarted","Data":"3e67b98896dd94a61dfbca6f0105b41e1a458aa0fe2009b29f6d03281877ef8e"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.944715 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7846c976fc-knpz2" event={"ID":"ff4d96b6-b227-41e4-a653-39b8475aa9de","Type":"ContainerStarted","Data":"a1d6f71eec30dd31e16c29d631753b4cf0aa16366b59afcfb7546a720088c939"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.948540 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pqf78" 
event={"ID":"7764a47f-6ccf-43f1-a787-99db87fb5cfb","Type":"ContainerStarted","Data":"da621f0c6b25449f74357a874ab160b9f5e8e6fd904ac3987073a5c016969689"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.951720 4783 generic.go:334] "Generic (PLEG): container finished" podID="1094fbb7-1b91-4925-85b4-c9dafceb46c9" containerID="d594b64cb69b29074ba9f06f5ac28af6cff153585d9b4b4443c8ec1eabe1344d" exitCode=0 Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.951786 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf84-account-create-update-nq754" event={"ID":"1094fbb7-1b91-4925-85b4-c9dafceb46c9","Type":"ContainerDied","Data":"d594b64cb69b29074ba9f06f5ac28af6cff153585d9b4b4443c8ec1eabe1344d"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.951813 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf84-account-create-update-nq754" event={"ID":"1094fbb7-1b91-4925-85b4-c9dafceb46c9","Type":"ContainerStarted","Data":"2b30690308e652f38a40f02c6220ea5af58db7fbf7ad2a4c4217340a071ae51a"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.954712 4783 generic.go:334] "Generic (PLEG): container finished" podID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerID="7b93ba43db8cca037d4ec83f5e78efbed6d4e07779129bd77b31e7f73945096c" exitCode=143 Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.954775 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerDied","Data":"7b93ba43db8cca037d4ec83f5e78efbed6d4e07779129bd77b31e7f73945096c"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.956563 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b1789966-6119-4be7-87b8-cca3381fc380","Type":"ContainerStarted","Data":"9a9304822e5fdbcf05994f3dc713e97767aec42d60dda31ce5587f1a57c0e08f"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 
09:19:54.960307 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" event={"ID":"35ed13cd-8052-462d-bbb3-7d2863d38c2e","Type":"ContainerStarted","Data":"7c3d2941941e632bfc5f9e5ba5a4e5330890c144ebd6f1a59fdf54dd43448c44"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.966181 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f95f-account-create-update-275t8" event={"ID":"46eac665-6761-4ff3-8718-6417ccea545d","Type":"ContainerStarted","Data":"9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.973300 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerStarted","Data":"a71d6a05ce109dbdccb37036d01610f4012d7012b7ea8b4ec54a99d5cd3cded3"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.975522 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2s7lh" event={"ID":"149825bf-5cac-45b3-a51f-f569f43fa5d0","Type":"ContainerStarted","Data":"7a948d5b3c41c612885d8f5940afc7b60735b76104bdefcb8d718c8186b4c7e8"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.977993 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-548c6d58db-rh9pc" event={"ID":"8c5f7b62-eadb-483b-b336-34e1ba8e881b","Type":"ContainerDied","Data":"4970428ea9d39e24a6102c5ea689c00f52b2dc12fcf95f98404c180c4af9d184"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.978010 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-548c6d58db-rh9pc" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.978031 4783 scope.go:117] "RemoveContainer" containerID="eeeb2276e4b6fb5bc70d48279133cb29782cf70016c2a9c62c5c25a1d4f94cd2" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.986772 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv69h" event={"ID":"56bc0e52-0e88-44a1-a30e-e8b3bafba9ce","Type":"ContainerDied","Data":"ce8845413a11bfce138082094e7c761f40ed82f75c39084c84ceba0a21f49175"} Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.986948 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv69h" Jan 31 09:19:54 crc kubenswrapper[4783]: I0131 09:19:54.999104 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.984799965 podStartE2EDuration="11.999091807s" podCreationTimestamp="2026-01-31 09:19:43 +0000 UTC" firstStartedPulling="2026-01-31 09:19:43.885916108 +0000 UTC m=+894.554599576" lastFinishedPulling="2026-01-31 09:19:53.90020795 +0000 UTC m=+904.568891418" observedRunningTime="2026-01-31 09:19:54.987589464 +0000 UTC m=+905.656272932" watchObservedRunningTime="2026-01-31 09:19:54.999091807 +0000 UTC m=+905.667775276" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.000087 4783 scope.go:117] "RemoveContainer" containerID="20aefd71d61b3c915fbf37bbf3e29db31841607652a4d8ed8d8fad64aa04c572" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.040958 4783 scope.go:117] "RemoveContainer" containerID="ffef1e41e300079ad8a0d2518b587f3dd300ee04a9422528f1304ea7c148afc4" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.047830 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.055424 4783 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-548c6d58db-rh9pc"] Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.060318 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv69h"] Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.062666 4783 scope.go:117] "RemoveContainer" containerID="59da742cf8ef60781d68cf920dacf4196deead8dfa110a1f3e94706551f94f13" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.067614 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cv69h"] Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.092710 4783 scope.go:117] "RemoveContainer" containerID="181efee9df1ee83e93ecc7545298584ea582d1b72484c250b0fcd142a8cf149d" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.667032 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" path="/var/lib/kubelet/pods/56bc0e52-0e88-44a1-a30e-e8b3bafba9ce/volumes" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.667843 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" path="/var/lib/kubelet/pods/8c5f7b62-eadb-483b-b336-34e1ba8e881b/volumes" Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.998544 4783 generic.go:334] "Generic (PLEG): container finished" podID="7764a47f-6ccf-43f1-a787-99db87fb5cfb" containerID="a1a0890df4932c8a0a7bc5c1f7ec446cf7562b17edfd7b01d7fd0a026d2b11d6" exitCode=0 Jan 31 09:19:55 crc kubenswrapper[4783]: I0131 09:19:55.998608 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pqf78" event={"ID":"7764a47f-6ccf-43f1-a787-99db87fb5cfb","Type":"ContainerDied","Data":"a1a0890df4932c8a0a7bc5c1f7ec446cf7562b17edfd7b01d7fd0a026d2b11d6"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.000536 4783 generic.go:334] "Generic (PLEG): container finished" podID="35ed13cd-8052-462d-bbb3-7d2863d38c2e" 
containerID="6834fb766d782505c282c414f27930a36a5e6731b75a0214876f329d8440066e" exitCode=0 Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.000584 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" event={"ID":"35ed13cd-8052-462d-bbb3-7d2863d38c2e","Type":"ContainerDied","Data":"6834fb766d782505c282c414f27930a36a5e6731b75a0214876f329d8440066e"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.002582 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerStarted","Data":"481b3e6775785dd5c4702178be7a31cc5636ae5625de30d223f4ef4b3b479a60"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.002612 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerStarted","Data":"ae2397cedc0d74e494b02de269b0ebdf09364ed0ceed643b9d190ce542b7c41c"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.004969 4783 generic.go:334] "Generic (PLEG): container finished" podID="46eac665-6761-4ff3-8718-6417ccea545d" containerID="634e4bb99cd4f4c0685080200c21099faa5c7885981ac72b80bb0ec74397d192" exitCode=0 Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.005032 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f95f-account-create-update-275t8" event={"ID":"46eac665-6761-4ff3-8718-6417ccea545d","Type":"ContainerDied","Data":"634e4bb99cd4f4c0685080200c21099faa5c7885981ac72b80bb0ec74397d192"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.006174 4783 generic.go:334] "Generic (PLEG): container finished" podID="149825bf-5cac-45b3-a51f-f569f43fa5d0" containerID="45efa1253dcd16e9db539aabbeb329901132f434749be449e60227f00738a351" exitCode=0 Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.006218 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-2s7lh" event={"ID":"149825bf-5cac-45b3-a51f-f569f43fa5d0","Type":"ContainerDied","Data":"45efa1253dcd16e9db539aabbeb329901132f434749be449e60227f00738a351"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.007975 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7846c976fc-knpz2" event={"ID":"ff4d96b6-b227-41e4-a653-39b8475aa9de","Type":"ContainerStarted","Data":"0575f6c869faf014dbc719fd55ffacf2b13f6331f71e27fb4c0ebf5b2f38ba82"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.008008 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7846c976fc-knpz2" event={"ID":"ff4d96b6-b227-41e4-a653-39b8475aa9de","Type":"ContainerStarted","Data":"0e37d552d6c3f1f3f56d2154612aa1e810b4869ae3c51ff5ae477168461e33de"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.008035 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.008066 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.019026 4783 generic.go:334] "Generic (PLEG): container finished" podID="6263f4cc-b742-4057-96a4-d4a058ad3f44" containerID="a0c7adbdcdc3c32262ec0c55d243958625d2a64ec655977717728d348bf879a4" exitCode=0 Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.019206 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-krxr5" event={"ID":"6263f4cc-b742-4057-96a4-d4a058ad3f44","Type":"ContainerDied","Data":"a0c7adbdcdc3c32262ec0c55d243958625d2a64ec655977717728d348bf879a4"} Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.081641 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7846c976fc-knpz2" podStartSLOduration=7.081626154 podStartE2EDuration="7.081626154s" 
podCreationTimestamp="2026-01-31 09:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:56.064453041 +0000 UTC m=+906.733136509" watchObservedRunningTime="2026-01-31 09:19:56.081626154 +0000 UTC m=+906.750309622" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.328286 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.527149 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt6ww\" (UniqueName: \"kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww\") pod \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.527338 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts\") pod \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\" (UID: \"1094fbb7-1b91-4925-85b4-c9dafceb46c9\") " Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.527903 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1094fbb7-1b91-4925-85b4-c9dafceb46c9" (UID: "1094fbb7-1b91-4925-85b4-c9dafceb46c9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.531082 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww" (OuterVolumeSpecName: "kube-api-access-zt6ww") pod "1094fbb7-1b91-4925-85b4-c9dafceb46c9" (UID: "1094fbb7-1b91-4925-85b4-c9dafceb46c9"). InnerVolumeSpecName "kube-api-access-zt6ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.629497 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt6ww\" (UniqueName: \"kubernetes.io/projected/1094fbb7-1b91-4925-85b4-c9dafceb46c9-kube-api-access-zt6ww\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:56 crc kubenswrapper[4783]: I0131 09:19:56.629527 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1094fbb7-1b91-4925-85b4-c9dafceb46c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.048849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf84-account-create-update-nq754" event={"ID":"1094fbb7-1b91-4925-85b4-c9dafceb46c9","Type":"ContainerDied","Data":"2b30690308e652f38a40f02c6220ea5af58db7fbf7ad2a4c4217340a071ae51a"} Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.049134 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b30690308e652f38a40f02c6220ea5af58db7fbf7ad2a4c4217340a071ae51a" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.048883 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bf84-account-create-update-nq754" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.060363 4783 generic.go:334] "Generic (PLEG): container finished" podID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerID="5b01fff520a9d67f2f3f01f9adc270009f597046e676c124e99d31cbbc8438f3" exitCode=0 Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.060511 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerDied","Data":"5b01fff520a9d67f2f3f01f9adc270009f597046e676c124e99d31cbbc8438f3"} Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.542848 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.586289 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.654750 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts\") pod \"149825bf-5cac-45b3-a51f-f569f43fa5d0\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.654948 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpf6h\" (UniqueName: \"kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h\") pod \"149825bf-5cac-45b3-a51f-f569f43fa5d0\" (UID: \"149825bf-5cac-45b3-a51f-f569f43fa5d0\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.655842 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "149825bf-5cac-45b3-a51f-f569f43fa5d0" (UID: "149825bf-5cac-45b3-a51f-f569f43fa5d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.660949 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h" (OuterVolumeSpecName: "kube-api-access-gpf6h") pod "149825bf-5cac-45b3-a51f-f569f43fa5d0" (UID: "149825bf-5cac-45b3-a51f-f569f43fa5d0"). InnerVolumeSpecName "kube-api-access-gpf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.683502 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.692383 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.715209 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.733385 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770766 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86k9g\" (UniqueName: \"kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g\") pod \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770821 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spfvl\" (UniqueName: \"kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl\") pod \"6263f4cc-b742-4057-96a4-d4a058ad3f44\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770846 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts\") pod \"6263f4cc-b742-4057-96a4-d4a058ad3f44\" (UID: \"6263f4cc-b742-4057-96a4-d4a058ad3f44\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770870 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770903 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.770938 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771012 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts\") pod \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771044 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts\") pod \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\" (UID: \"7764a47f-6ccf-43f1-a787-99db87fb5cfb\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771115 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771134 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771232 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2vpz\" (UniqueName: \"kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz\") pod \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\" (UID: \"35ed13cd-8052-462d-bbb3-7d2863d38c2e\") " Jan 
31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771260 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts\") pod \"46eac665-6761-4ff3-8718-6417ccea545d\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771294 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd8hp\" (UniqueName: \"kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771316 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771377 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvb6\" (UniqueName: \"kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6\") pod \"46eac665-6761-4ff3-8718-6417ccea545d\" (UID: \"46eac665-6761-4ff3-8718-6417ccea545d\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.771407 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run\") pod \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\" (UID: \"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772336 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "46eac665-6761-4ff3-8718-6417ccea545d" (UID: "46eac665-6761-4ff3-8718-6417ccea545d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772587 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7764a47f-6ccf-43f1-a787-99db87fb5cfb" (UID: "7764a47f-6ccf-43f1-a787-99db87fb5cfb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772724 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149825bf-5cac-45b3-a51f-f569f43fa5d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772745 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7764a47f-6ccf-43f1-a787-99db87fb5cfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772754 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpf6h\" (UniqueName: \"kubernetes.io/projected/149825bf-5cac-45b3-a51f-f569f43fa5d0-kube-api-access-gpf6h\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.772765 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46eac665-6761-4ff3-8718-6417ccea545d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.775664 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run" (OuterVolumeSpecName: "httpd-run") 
pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.777258 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts" (OuterVolumeSpecName: "scripts") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.777961 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp" (OuterVolumeSpecName: "kube-api-access-xd8hp") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "kube-api-access-xd8hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.778091 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs" (OuterVolumeSpecName: "logs") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.778447 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35ed13cd-8052-462d-bbb3-7d2863d38c2e" (UID: "35ed13cd-8052-462d-bbb3-7d2863d38c2e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.778535 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl" (OuterVolumeSpecName: "kube-api-access-spfvl") pod "6263f4cc-b742-4057-96a4-d4a058ad3f44" (UID: "6263f4cc-b742-4057-96a4-d4a058ad3f44"). InnerVolumeSpecName "kube-api-access-spfvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.778827 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6263f4cc-b742-4057-96a4-d4a058ad3f44" (UID: "6263f4cc-b742-4057-96a4-d4a058ad3f44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.779484 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g" (OuterVolumeSpecName: "kube-api-access-86k9g") pod "7764a47f-6ccf-43f1-a787-99db87fb5cfb" (UID: "7764a47f-6ccf-43f1-a787-99db87fb5cfb"). InnerVolumeSpecName "kube-api-access-86k9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.787479 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6" (OuterVolumeSpecName: "kube-api-access-wfvb6") pod "46eac665-6761-4ff3-8718-6417ccea545d" (UID: "46eac665-6761-4ff3-8718-6417ccea545d"). InnerVolumeSpecName "kube-api-access-wfvb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.787548 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.795877 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz" (OuterVolumeSpecName: "kube-api-access-b2vpz") pod "35ed13cd-8052-462d-bbb3-7d2863d38c2e" (UID: "35ed13cd-8052-462d-bbb3-7d2863d38c2e"). InnerVolumeSpecName "kube-api-access-b2vpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.820206 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.822934 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data" (OuterVolumeSpecName: "config-data") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.824049 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.832188 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" (UID: "cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.873760 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.873888 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.873959 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.873991 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874017 4783 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874045 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mpg\" (UniqueName: \"kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874076 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs\") pod \"8dda3593-0628-4253-995b-b662d252462e\" (UID: \"8dda3593-0628-4253-995b-b662d252462e\") " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874335 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874354 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874366 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2vpz\" (UniqueName: \"kubernetes.io/projected/35ed13cd-8052-462d-bbb3-7d2863d38c2e-kube-api-access-b2vpz\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874377 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd8hp\" (UniqueName: 
\"kubernetes.io/projected/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-kube-api-access-xd8hp\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874400 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874409 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvb6\" (UniqueName: \"kubernetes.io/projected/46eac665-6761-4ff3-8718-6417ccea545d-kube-api-access-wfvb6\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874418 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874431 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86k9g\" (UniqueName: \"kubernetes.io/projected/7764a47f-6ccf-43f1-a787-99db87fb5cfb-kube-api-access-86k9g\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874440 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spfvl\" (UniqueName: \"kubernetes.io/projected/6263f4cc-b742-4057-96a4-d4a058ad3f44-kube-api-access-spfvl\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874449 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6263f4cc-b742-4057-96a4-d4a058ad3f44-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874457 4783 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-public-tls-certs\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874464 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874472 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.874480 4783 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ed13cd-8052-462d-bbb3-7d2863d38c2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.875731 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs" (OuterVolumeSpecName: "logs") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.879545 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.889304 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": read tcp 10.217.0.2:55574->10.217.0.151:9292: read: connection reset by peer" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.889428 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": read tcp 10.217.0.2:55568->10.217.0.151:9292: read: connection reset by peer" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.892740 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg" (OuterVolumeSpecName: "kube-api-access-m7mpg") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "kube-api-access-m7mpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.900187 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.904658 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.911547 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data" (OuterVolumeSpecName: "config-data") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.941676 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts" (OuterVolumeSpecName: "scripts") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.957928 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8dda3593-0628-4253-995b-b662d252462e" (UID: "8dda3593-0628-4253-995b-b662d252462e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975690 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mpg\" (UniqueName: \"kubernetes.io/projected/8dda3593-0628-4253-995b-b662d252462e-kube-api-access-m7mpg\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975720 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dda3593-0628-4253-995b-b662d252462e-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975733 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975745 4783 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975758 4783 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975768 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dda3593-0628-4253-995b-b662d252462e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975781 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:57 crc kubenswrapper[4783]: I0131 09:19:57.975794 4783 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dda3593-0628-4253-995b-b662d252462e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.071775 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7","Type":"ContainerDied","Data":"4ac707186d7e224d05566301d7bd1db3bc8c84ded3a15f31feb1122387893f31"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.071842 4783 scope.go:117] "RemoveContainer" containerID="5b01fff520a9d67f2f3f01f9adc270009f597046e676c124e99d31cbbc8438f3" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.071996 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.084433 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.084450 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6717-account-create-update-p4bsh" event={"ID":"35ed13cd-8052-462d-bbb3-7d2863d38c2e","Type":"ContainerDied","Data":"7c3d2941941e632bfc5f9e5ba5a4e5330890c144ebd6f1a59fdf54dd43448c44"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.084481 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c3d2941941e632bfc5f9e5ba5a4e5330890c144ebd6f1a59fdf54dd43448c44" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.099156 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f95f-account-create-update-275t8" event={"ID":"46eac665-6761-4ff3-8718-6417ccea545d","Type":"ContainerDied","Data":"9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.099202 4783 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bf2612affa8ae4ba33e4c1d4ced70d6814af3f14da8a42e69f758d9bba332e8" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.099247 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f95f-account-create-update-275t8" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.104719 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2s7lh" event={"ID":"149825bf-5cac-45b3-a51f-f569f43fa5d0","Type":"ContainerDied","Data":"7a948d5b3c41c612885d8f5940afc7b60735b76104bdefcb8d718c8186b4c7e8"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.104744 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a948d5b3c41c612885d8f5940afc7b60735b76104bdefcb8d718c8186b4c7e8" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.104784 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2s7lh" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.119683 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-krxr5" event={"ID":"6263f4cc-b742-4057-96a4-d4a058ad3f44","Type":"ContainerDied","Data":"3e67b98896dd94a61dfbca6f0105b41e1a458aa0fe2009b29f6d03281877ef8e"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.119709 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e67b98896dd94a61dfbca6f0105b41e1a458aa0fe2009b29f6d03281877ef8e" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.119741 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-krxr5" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.122782 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pqf78" event={"ID":"7764a47f-6ccf-43f1-a787-99db87fb5cfb","Type":"ContainerDied","Data":"da621f0c6b25449f74357a874ab160b9f5e8e6fd904ac3987073a5c016969689"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.122815 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da621f0c6b25449f74357a874ab160b9f5e8e6fd904ac3987073a5c016969689" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.122889 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pqf78" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.131013 4783 generic.go:334] "Generic (PLEG): container finished" podID="8dda3593-0628-4253-995b-b662d252462e" containerID="01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2" exitCode=137 Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.131054 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerDied","Data":"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.131073 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f5ff596f4-ffmss" event={"ID":"8dda3593-0628-4253-995b-b662d252462e","Type":"ContainerDied","Data":"1eb81414a433df5bc0455962382e90f25a71ff960b75ec7400c8715af6aedb23"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.131113 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f5ff596f4-ffmss" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.146130 4783 generic.go:334] "Generic (PLEG): container finished" podID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerID="467bb4b3b850268e9cc8f923dd2680181e0341456bb9481b712e6537c005e9b3" exitCode=0 Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.146172 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerDied","Data":"467bb4b3b850268e9cc8f923dd2680181e0341456bb9481b712e6537c005e9b3"} Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.189298 4783 scope.go:117] "RemoveContainer" containerID="c62b41e75fbb1a433cfb1c2dff030c8b66835350f6bab80fb2356f17562c3834" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.191554 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.243591 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.259186 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260230 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="registry-server" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260264 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="registry-server" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260281 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260288 4783 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260308 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1094fbb7-1b91-4925-85b4-c9dafceb46c9" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260314 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1094fbb7-1b91-4925-85b4-c9dafceb46c9" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260335 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-api" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260343 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-api" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260364 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-httpd" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260370 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-httpd" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260383 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149825bf-5cac-45b3-a51f-f569f43fa5d0" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260390 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="149825bf-5cac-45b3-a51f-f569f43fa5d0" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260413 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ed13cd-8052-462d-bbb3-7d2863d38c2e" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260419 4783 
state_mem.go:107] "Deleted CPUSet assignment" podUID="35ed13cd-8052-462d-bbb3-7d2863d38c2e" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260427 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="extract-content" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260432 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="extract-content" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260446 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-httpd" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260453 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-httpd" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260465 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="extract-utilities" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260475 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="extract-utilities" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260504 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6263f4cc-b742-4057-96a4-d4a058ad3f44" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260513 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6263f4cc-b742-4057-96a4-d4a058ad3f44" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260527 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46eac665-6761-4ff3-8718-6417ccea545d" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc 
kubenswrapper[4783]: I0131 09:19:58.260533 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="46eac665-6761-4ff3-8718-6417ccea545d" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260556 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon-log" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260562 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon-log" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260578 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7764a47f-6ccf-43f1-a787-99db87fb5cfb" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260584 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="7764a47f-6ccf-43f1-a787-99db87fb5cfb" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.260601 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-log" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260607 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-log" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260924 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260938 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-httpd" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260950 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-httpd" Jan 31 09:19:58 crc 
kubenswrapper[4783]: I0131 09:19:58.260972 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dda3593-0628-4253-995b-b662d252462e" containerName="horizon-log" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260981 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="46eac665-6761-4ff3-8718-6417ccea545d" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.260993 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="149825bf-5cac-45b3-a51f-f569f43fa5d0" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261010 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6263f4cc-b742-4057-96a4-d4a058ad3f44" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261027 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ed13cd-8052-462d-bbb3-7d2863d38c2e" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261034 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5f7b62-eadb-483b-b336-34e1ba8e881b" containerName="neutron-api" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261045 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="7764a47f-6ccf-43f1-a787-99db87fb5cfb" containerName="mariadb-database-create" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261059 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1094fbb7-1b91-4925-85b4-c9dafceb46c9" containerName="mariadb-account-create-update" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261071 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" containerName="glance-log" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.261085 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="56bc0e52-0e88-44a1-a30e-e8b3bafba9ce" containerName="registry-server" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.300908 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.303945 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.306869 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.307940 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.321281 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f5ff596f4-ffmss"] Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.324733 4783 scope.go:117] "RemoveContainer" containerID="fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.330966 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.345506 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.490073 4783 scope.go:117] "RemoveContainer" containerID="01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.491795 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.492209 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.492258 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.492474 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs" (OuterVolumeSpecName: "logs") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.492605 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493053 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493113 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493135 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493498 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkqw7\" (UniqueName: \"kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493538 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data\") pod \"95d6fbb2-49fd-4d85-8247-413da61c4c16\" (UID: \"95d6fbb2-49fd-4d85-8247-413da61c4c16\") " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493781 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-logs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493854 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493922 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493971 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcdn\" (UniqueName: \"kubernetes.io/projected/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-kube-api-access-7hcdn\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.493998 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.494125 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.494313 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.494391 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.495650 4783 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.495674 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d6fbb2-49fd-4d85-8247-413da61c4c16-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.497385 4783 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.499272 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts" (OuterVolumeSpecName: "scripts") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.506614 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7" (OuterVolumeSpecName: "kube-api-access-zkqw7") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "kube-api-access-zkqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.521449 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.531052 4783 scope.go:117] "RemoveContainer" containerID="fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.531522 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458\": container with ID starting with fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458 not found: ID does not exist" containerID="fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.531557 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458"} err="failed to get container status \"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458\": rpc error: code = NotFound desc = could not find container \"fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458\": container with ID starting with fdb0556ca619eb1273ab3de64979e36b7cd85f9ae88fac7da9dbe1278cce5458 not found: ID does not exist" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.531585 4783 scope.go:117] "RemoveContainer" containerID="01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2" Jan 31 09:19:58 crc kubenswrapper[4783]: E0131 09:19:58.531840 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2\": container with ID starting with 01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2 not found: ID does not exist" containerID="01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.531861 
4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2"} err="failed to get container status \"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2\": rpc error: code = NotFound desc = could not find container \"01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2\": container with ID starting with 01d315f893e2e65f63bc946effea328252794d35866a0d1cd4cd689f197bbbb2 not found: ID does not exist" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.541274 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.577142 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data" (OuterVolumeSpecName: "config-data") pod "95d6fbb2-49fd-4d85-8247-413da61c4c16" (UID: "95d6fbb2-49fd-4d85-8247-413da61c4c16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597694 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597784 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcdn\" (UniqueName: \"kubernetes.io/projected/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-kube-api-access-7hcdn\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597818 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597841 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597885 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc 
kubenswrapper[4783]: I0131 09:19:58.597912 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597935 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-logs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.597954 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598025 4783 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598051 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598060 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598069 4783 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598078 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkqw7\" (UniqueName: \"kubernetes.io/projected/95d6fbb2-49fd-4d85-8247-413da61c4c16-kube-api-access-zkqw7\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598087 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d6fbb2-49fd-4d85-8247-413da61c4c16-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598442 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598681 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-logs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.598759 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.603461 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.604709 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.605577 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.613926 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.625961 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hcdn\" (UniqueName: \"kubernetes.io/projected/7b9825cb-8bd2-446b-80ab-d6bdd294d51d-kube-api-access-7hcdn\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.638073 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node 
"crc" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.640682 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7b9825cb-8bd2-446b-80ab-d6bdd294d51d\") " pod="openstack/glance-default-external-api-0" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.699969 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:58 crc kubenswrapper[4783]: I0131 09:19:58.934234 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.171230 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95d6fbb2-49fd-4d85-8247-413da61c4c16","Type":"ContainerDied","Data":"1c9220263ededa3a8891d0c4a04034a3da6fa299dcdbcba6c831dcdb34a47597"} Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.171527 4783 scope.go:117] "RemoveContainer" containerID="467bb4b3b850268e9cc8f923dd2680181e0341456bb9481b712e6537c005e9b3" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.171656 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.182563 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerStarted","Data":"bc418d7ea41fe8cafb8af1aca3f9c05406e197865366c38d4ccf40051683efaa"} Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.182814 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-central-agent" containerID="cri-o://a71d6a05ce109dbdccb37036d01610f4012d7012b7ea8b4ec54a99d5cd3cded3" gracePeriod=30 Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.183152 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.183602 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="proxy-httpd" containerID="cri-o://bc418d7ea41fe8cafb8af1aca3f9c05406e197865366c38d4ccf40051683efaa" gracePeriod=30 Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.183663 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="sg-core" containerID="cri-o://481b3e6775785dd5c4702178be7a31cc5636ae5625de30d223f4ef4b3b479a60" gracePeriod=30 Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.183700 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-notification-agent" containerID="cri-o://ae2397cedc0d74e494b02de269b0ebdf09364ed0ceed643b9d190ce542b7c41c" gracePeriod=30 Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.210210 4783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.749498846 podStartE2EDuration="13.210197718s" podCreationTimestamp="2026-01-31 09:19:46 +0000 UTC" firstStartedPulling="2026-01-31 09:19:47.607807841 +0000 UTC m=+898.276491309" lastFinishedPulling="2026-01-31 09:19:58.068506713 +0000 UTC m=+908.737190181" observedRunningTime="2026-01-31 09:19:59.200771768 +0000 UTC m=+909.869455236" watchObservedRunningTime="2026-01-31 09:19:59.210197718 +0000 UTC m=+909.878881186" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.214637 4783 scope.go:117] "RemoveContainer" containerID="7b93ba43db8cca037d4ec83f5e78efbed6d4e07779129bd77b31e7f73945096c" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.228376 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.244704 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.279394 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:19:59 crc kubenswrapper[4783]: E0131 09:19:59.280036 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-httpd" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.280057 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-httpd" Jan 31 09:19:59 crc kubenswrapper[4783]: E0131 09:19:59.280120 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-log" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.280130 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-log" Jan 31 
09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.280513 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-httpd" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.280540 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" containerName="glance-log" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.281796 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.285964 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.287634 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.290215 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.420875 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421142 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421182 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421235 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421256 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421281 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.421307 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b82nj\" (UniqueName: \"kubernetes.io/projected/6c4aab5d-1107-4452-9f07-fc45f446eb01-kube-api-access-b82nj\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 
09:19:59.421345 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.522768 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.522832 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.522902 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.522928 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.522989 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.523029 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.523057 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.523108 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b82nj\" (UniqueName: \"kubernetes.io/projected/6c4aab5d-1107-4452-9f07-fc45f446eb01-kube-api-access-b82nj\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.523358 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.531483 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.538697 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.538696 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.542749 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c4aab5d-1107-4452-9f07-fc45f446eb01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.547460 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.549822 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b82nj\" (UniqueName: \"kubernetes.io/projected/6c4aab5d-1107-4452-9f07-fc45f446eb01-kube-api-access-b82nj\") pod 
\"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.550080 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c4aab5d-1107-4452-9f07-fc45f446eb01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.573235 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.589648 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c4aab5d-1107-4452-9f07-fc45f446eb01\") " pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.629303 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.670890 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dda3593-0628-4253-995b-b662d252462e" path="/var/lib/kubelet/pods/8dda3593-0628-4253-995b-b662d252462e/volumes" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.671885 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d6fbb2-49fd-4d85-8247-413da61c4c16" path="/var/lib/kubelet/pods/95d6fbb2-49fd-4d85-8247-413da61c4c16/volumes" Jan 31 09:19:59 crc kubenswrapper[4783]: I0131 09:19:59.672653 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7" path="/var/lib/kubelet/pods/cbc61a4b-0a7b-4172-a7e9-41dfedb5f8f7/volumes" Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.193109 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207438 4783 generic.go:334] "Generic (PLEG): container finished" podID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerID="bc418d7ea41fe8cafb8af1aca3f9c05406e197865366c38d4ccf40051683efaa" exitCode=0 Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207476 4783 generic.go:334] "Generic (PLEG): container finished" podID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerID="481b3e6775785dd5c4702178be7a31cc5636ae5625de30d223f4ef4b3b479a60" exitCode=2 Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207499 4783 generic.go:334] "Generic (PLEG): container finished" podID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerID="ae2397cedc0d74e494b02de269b0ebdf09364ed0ceed643b9d190ce542b7c41c" exitCode=0 Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207568 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerDied","Data":"bc418d7ea41fe8cafb8af1aca3f9c05406e197865366c38d4ccf40051683efaa"} Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207604 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerDied","Data":"481b3e6775785dd5c4702178be7a31cc5636ae5625de30d223f4ef4b3b479a60"} Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.207618 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerDied","Data":"ae2397cedc0d74e494b02de269b0ebdf09364ed0ceed643b9d190ce542b7c41c"} Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.210740 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b9825cb-8bd2-446b-80ab-d6bdd294d51d","Type":"ContainerStarted","Data":"14577793799b76cf98877a27d99788e64b9edc34a6ea8f013eec22b54a923c00"} Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.210815 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b9825cb-8bd2-446b-80ab-d6bdd294d51d","Type":"ContainerStarted","Data":"d31d91011220865d61595f7870e4640b3e51aa5a290eb7a11b2fa89756d0b19a"} Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.883731 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwssw"] Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.889639 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.892277 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kv5l5" Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.892635 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.892792 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 31 09:20:00 crc kubenswrapper[4783]: I0131 09:20:00.909266 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwssw"] Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.065633 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.065852 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.066132 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xgjn\" (UniqueName: \"kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " 
pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.066238 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.167517 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.167633 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xgjn\" (UniqueName: \"kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.167673 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.167712 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " 
pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.172534 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.172809 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.174037 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.187130 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xgjn\" (UniqueName: \"kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn\") pod \"nova-cell0-conductor-db-sync-cwssw\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") " pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.223097 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b9825cb-8bd2-446b-80ab-d6bdd294d51d","Type":"ContainerStarted","Data":"caa4e0ec513cb361c72d87e5357c7e37438ce43fc0eedd6b5a9e3ad26cc2ec15"} Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.227065 4783 
generic.go:334] "Generic (PLEG): container finished" podID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerID="a71d6a05ce109dbdccb37036d01610f4012d7012b7ea8b4ec54a99d5cd3cded3" exitCode=0 Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.227121 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerDied","Data":"a71d6a05ce109dbdccb37036d01610f4012d7012b7ea8b4ec54a99d5cd3cded3"} Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.228669 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c4aab5d-1107-4452-9f07-fc45f446eb01","Type":"ContainerStarted","Data":"8dfbcf2dab7acea87e0205e5ef5017d328456c0199753371f289d231baccc3c2"} Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.228700 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c4aab5d-1107-4452-9f07-fc45f446eb01","Type":"ContainerStarted","Data":"fd5fdf00ad0217901db08414695c40b91d9459442ad56be47fc0c7b11c924f3c"} Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.251503 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.251456956 podStartE2EDuration="3.251456956s" podCreationTimestamp="2026-01-31 09:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:01.248566867 +0000 UTC m=+911.917250336" watchObservedRunningTime="2026-01-31 09:20:01.251456956 +0000 UTC m=+911.920140424" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.259592 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwssw" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.530427 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.678598 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4bx\" (UniqueName: \"kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.678732 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.678873 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.678960 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.679188 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.679218 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.679406 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd\") pod \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\" (UID: \"fd14d4b5-f44b-47e2-b273-e05e6e9047e0\") " Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.680595 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.704389 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.711345 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx" (OuterVolumeSpecName: "kube-api-access-ks4bx") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "kube-api-access-ks4bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.713055 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts" (OuterVolumeSpecName: "scripts") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.735610 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.771819 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782069 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782097 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782107 4783 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782118 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4bx\" (UniqueName: \"kubernetes.io/projected/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-kube-api-access-ks4bx\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782130 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.782138 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.793795 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data" (OuterVolumeSpecName: "config-data") pod "fd14d4b5-f44b-47e2-b273-e05e6e9047e0" (UID: "fd14d4b5-f44b-47e2-b273-e05e6e9047e0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.808903 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwssw"] Jan 31 09:20:01 crc kubenswrapper[4783]: I0131 09:20:01.883043 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd14d4b5-f44b-47e2-b273-e05e6e9047e0-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.263472 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd14d4b5-f44b-47e2-b273-e05e6e9047e0","Type":"ContainerDied","Data":"fc0674cad5e4cabc301332a9ad031ac7593799fe8b2a40102c405196626bda97"} Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.263516 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.263552 4783 scope.go:117] "RemoveContainer" containerID="bc418d7ea41fe8cafb8af1aca3f9c05406e197865366c38d4ccf40051683efaa" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.268936 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwssw" event={"ID":"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef","Type":"ContainerStarted","Data":"da822e6422e728c880a8930284618b0ec377febdbe6f9b73ec17cba7d63b6b07"} Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.272910 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6c4aab5d-1107-4452-9f07-fc45f446eb01","Type":"ContainerStarted","Data":"eb559fc8fc78c982ef32b1776441bd0779214438ddf97b9ff18d770666165d88"} Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.297065 4783 scope.go:117] "RemoveContainer" containerID="481b3e6775785dd5c4702178be7a31cc5636ae5625de30d223f4ef4b3b479a60" Jan 31 09:20:02 
crc kubenswrapper[4783]: I0131 09:20:02.331620 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.331606096 podStartE2EDuration="3.331606096s" podCreationTimestamp="2026-01-31 09:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:02.295360619 +0000 UTC m=+912.964044087" watchObservedRunningTime="2026-01-31 09:20:02.331606096 +0000 UTC m=+913.000289554" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.337324 4783 scope.go:117] "RemoveContainer" containerID="ae2397cedc0d74e494b02de269b0ebdf09364ed0ceed643b9d190ce542b7c41c" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.349224 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.375293 4783 scope.go:117] "RemoveContainer" containerID="a71d6a05ce109dbdccb37036d01610f4012d7012b7ea8b4ec54a99d5cd3cded3" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.377711 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.399330 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:02 crc kubenswrapper[4783]: E0131 09:20:02.399948 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="proxy-httpd" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400028 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="proxy-httpd" Jan 31 09:20:02 crc kubenswrapper[4783]: E0131 09:20:02.400114 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-notification-agent" Jan 31 09:20:02 crc 
kubenswrapper[4783]: I0131 09:20:02.400201 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-notification-agent" Jan 31 09:20:02 crc kubenswrapper[4783]: E0131 09:20:02.400296 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-central-agent" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400366 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-central-agent" Jan 31 09:20:02 crc kubenswrapper[4783]: E0131 09:20:02.400436 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="sg-core" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400487 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="sg-core" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400812 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="proxy-httpd" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400897 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-notification-agent" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.400978 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="sg-core" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.401043 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" containerName="ceilometer-central-agent" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.402874 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.412955 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.413116 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.443055 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.506427 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.506747 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.506772 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.506812 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " 
pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.506924 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.507042 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.507072 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxck\" (UniqueName: \"kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608105 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608147 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxck\" (UniqueName: \"kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608228 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608257 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608271 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608290 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.608337 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.609058 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 
09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.609077 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.614049 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.614803 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.615367 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.626900 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.637683 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxck\" (UniqueName: \"kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck\") pod \"ceilometer-0\" (UID: 
\"12de2fc0-cdd7-4abb-8892-bded5646da4a\") " pod="openstack/ceilometer-0" Jan 31 09:20:02 crc kubenswrapper[4783]: I0131 09:20:02.743689 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:03 crc kubenswrapper[4783]: I0131 09:20:03.176139 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:03 crc kubenswrapper[4783]: W0131 09:20:03.185849 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12de2fc0_cdd7_4abb_8892_bded5646da4a.slice/crio-0e8c7685b2546c42617c3430ce10d871682dccf02b7b6c1c2d13bfcce58304cc WatchSource:0}: Error finding container 0e8c7685b2546c42617c3430ce10d871682dccf02b7b6c1c2d13bfcce58304cc: Status 404 returned error can't find the container with id 0e8c7685b2546c42617c3430ce10d871682dccf02b7b6c1c2d13bfcce58304cc Jan 31 09:20:03 crc kubenswrapper[4783]: I0131 09:20:03.286724 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerStarted","Data":"0e8c7685b2546c42617c3430ce10d871682dccf02b7b6c1c2d13bfcce58304cc"} Jan 31 09:20:03 crc kubenswrapper[4783]: I0131 09:20:03.671800 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd14d4b5-f44b-47e2-b273-e05e6e9047e0" path="/var/lib/kubelet/pods/fd14d4b5-f44b-47e2-b273-e05e6e9047e0/volumes" Jan 31 09:20:04 crc kubenswrapper[4783]: I0131 09:20:04.297452 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerStarted","Data":"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"} Jan 31 09:20:04 crc kubenswrapper[4783]: I0131 09:20:04.758817 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:20:04 crc 
kubenswrapper[4783]: I0131 09:20:04.759794 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7846c976fc-knpz2" Jan 31 09:20:05 crc kubenswrapper[4783]: I0131 09:20:05.308546 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerStarted","Data":"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"} Jan 31 09:20:08 crc kubenswrapper[4783]: I0131 09:20:08.339210 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:08 crc kubenswrapper[4783]: I0131 09:20:08.934563 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 09:20:08 crc kubenswrapper[4783]: I0131 09:20:08.934889 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 09:20:08 crc kubenswrapper[4783]: I0131 09:20:08.974655 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 09:20:08 crc kubenswrapper[4783]: I0131 09:20:08.988570 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.352585 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.353448 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.630409 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.630469 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.679953 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:09 crc kubenswrapper[4783]: I0131 09:20:09.680272 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:10 crc kubenswrapper[4783]: I0131 09:20:10.366861 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerStarted","Data":"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"} Jan 31 09:20:10 crc kubenswrapper[4783]: I0131 09:20:10.370449 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwssw" event={"ID":"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef","Type":"ContainerStarted","Data":"fbd9072a8d3842a95774cac15afdcedf7d6510f10e98405f7df9cb200014801e"} Jan 31 09:20:10 crc kubenswrapper[4783]: I0131 09:20:10.370914 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:10 crc kubenswrapper[4783]: I0131 09:20:10.370945 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:10 crc kubenswrapper[4783]: I0131 09:20:10.386741 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cwssw" podStartSLOduration=2.2680602739999998 podStartE2EDuration="10.386725182s" podCreationTimestamp="2026-01-31 09:20:00 +0000 UTC" firstStartedPulling="2026-01-31 09:20:01.817869522 +0000 UTC m=+912.486552989" lastFinishedPulling="2026-01-31 09:20:09.936534429 +0000 UTC m=+920.605217897" observedRunningTime="2026-01-31 09:20:10.384137645 +0000 UTC m=+921.052821112" 
watchObservedRunningTime="2026-01-31 09:20:10.386725182 +0000 UTC m=+921.055408650" Jan 31 09:20:11 crc kubenswrapper[4783]: I0131 09:20:11.163422 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 09:20:11 crc kubenswrapper[4783]: I0131 09:20:11.165381 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.087143 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.092769 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.390304 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerStarted","Data":"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"} Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.390573 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-central-agent" containerID="cri-o://25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de" gracePeriod=30 Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.390755 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="proxy-httpd" containerID="cri-o://1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7" gracePeriod=30 Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.390826 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-notification-agent" containerID="cri-o://27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee" gracePeriod=30
Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.390792 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="sg-core" containerID="cri-o://baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85" gracePeriod=30
Jan 31 09:20:12 crc kubenswrapper[4783]: I0131 09:20:12.410447 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5794715529999999 podStartE2EDuration="10.410434397s" podCreationTimestamp="2026-01-31 09:20:02 +0000 UTC" firstStartedPulling="2026-01-31 09:20:03.189860468 +0000 UTC m=+913.858543937" lastFinishedPulling="2026-01-31 09:20:12.020823314 +0000 UTC m=+922.689506781" observedRunningTime="2026-01-31 09:20:12.407028647 +0000 UTC m=+923.075712134" watchObservedRunningTime="2026-01-31 09:20:12.410434397 +0000 UTC m=+923.079117865"
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400350 4783 generic.go:334] "Generic (PLEG): container finished" podID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerID="1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7" exitCode=0
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400597 4783 generic.go:334] "Generic (PLEG): container finished" podID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerID="baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85" exitCode=2
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400606 4783 generic.go:334] "Generic (PLEG): container finished" podID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerID="25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de" exitCode=0
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400458 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerDied","Data":"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"}
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400850 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerDied","Data":"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"}
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.400875 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerDied","Data":"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"}
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.787380 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843301 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843356 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843393 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843433 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbxck\" (UniqueName: \"kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843461 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843586 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.843628 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd\") pod \"12de2fc0-cdd7-4abb-8892-bded5646da4a\" (UID: \"12de2fc0-cdd7-4abb-8892-bded5646da4a\") "
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.844711 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.844934 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.851259 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts" (OuterVolumeSpecName: "scripts") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.852858 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck" (OuterVolumeSpecName: "kube-api-access-fbxck") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "kube-api-access-fbxck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.867703 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.902276 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.912176 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data" (OuterVolumeSpecName: "config-data") pod "12de2fc0-cdd7-4abb-8892-bded5646da4a" (UID: "12de2fc0-cdd7-4abb-8892-bded5646da4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946530 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946573 4783 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946585 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946596 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbxck\" (UniqueName: \"kubernetes.io/projected/12de2fc0-cdd7-4abb-8892-bded5646da4a-kube-api-access-fbxck\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946607 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946616 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12de2fc0-cdd7-4abb-8892-bded5646da4a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:13 crc kubenswrapper[4783]: I0131 09:20:13.946625 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12de2fc0-cdd7-4abb-8892-bded5646da4a-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.410609 4783 generic.go:334] "Generic (PLEG): container finished" podID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerID="27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee" exitCode=0
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.411752 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerDied","Data":"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"}
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.411883 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12de2fc0-cdd7-4abb-8892-bded5646da4a","Type":"ContainerDied","Data":"0e8c7685b2546c42617c3430ce10d871682dccf02b7b6c1c2d13bfcce58304cc"}
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.411975 4783 scope.go:117] "RemoveContainer" containerID="1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.412212 4783 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.430445 4783 scope.go:117] "RemoveContainer" containerID="baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.444759 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.457200 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.467526 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.468179 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-central-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.470116 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-central-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.470260 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="proxy-httpd"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.470764 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="proxy-httpd"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.475086 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-notification-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.475232 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-notification-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.475337 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="sg-core"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.475413 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="sg-core"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.474984 4783 scope.go:117] "RemoveContainer" containerID="27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.475958 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="sg-core"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.476183 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="proxy-httpd"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.476267 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-central-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.476356 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" containerName="ceilometer-notification-agent"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.485965 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.486345 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.489230 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.489415 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.514068 4783 scope.go:117] "RemoveContainer" containerID="25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.537754 4783 scope.go:117] "RemoveContainer" containerID="1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.538209 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7\": container with ID starting with 1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7 not found: ID does not exist" containerID="1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538241 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7"} err="failed to get container status \"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7\": rpc error: code = NotFound desc = could not find container \"1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7\": container with ID starting with 1ee0db7c4eb61b600615317adcb589e4f12e40f79ee4d27b28cd34cb602054e7 not found: ID does not exist"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538264 4783 scope.go:117] "RemoveContainer" containerID="baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"
Jan 31 09:20:14 crc
kubenswrapper[4783]: E0131 09:20:14.538494 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85\": container with ID starting with baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85 not found: ID does not exist" containerID="baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538521 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85"} err="failed to get container status \"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85\": rpc error: code = NotFound desc = could not find container \"baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85\": container with ID starting with baacdff548290bc50606fc3d44d8f5dae37a1e17a3ec81027d7b46ba75025d85 not found: ID does not exist"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538534 4783 scope.go:117] "RemoveContainer" containerID="27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.538726 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee\": container with ID starting with 27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee not found: ID does not exist" containerID="27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538745 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee"} err="failed to get container status \"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee\": rpc error: code = NotFound desc = could not find container \"27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee\": container with ID starting with 27bb4b007fad80efe3e83d6928cd74a46be48dac1fa3a5ed7437888faa0230ee not found: ID does not exist"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.538757 4783 scope.go:117] "RemoveContainer" containerID="25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"
Jan 31 09:20:14 crc kubenswrapper[4783]: E0131 09:20:14.539316 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de\": container with ID starting with 25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de not found: ID does not exist" containerID="25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.539378 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de"} err="failed to get container status \"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de\": rpc error: code = NotFound desc = could not find container \"25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de\": container with ID starting with 25f252f42507381d6f5df07d00a97adb36d64680b772b24263a37377b71669de not found: ID does not exist"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557339 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557438 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557482 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkc5\" (UniqueName: \"kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557506 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557580 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557652 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.557684 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName:
\"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659319 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkc5\" (UniqueName: \"kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659376 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659407 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659441 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659461 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659491 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.659559 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.660359 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.660640 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.663524 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.663671 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.663836 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.665791 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.674280 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkc5\" (UniqueName: \"kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5\") pod \"ceilometer-0\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " pod="openstack/ceilometer-0"
Jan 31 09:20:14 crc kubenswrapper[4783]: I0131 09:20:14.812378 4783 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Jan 31 09:20:15 crc kubenswrapper[4783]: I0131 09:20:15.223875 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 31 09:20:15 crc kubenswrapper[4783]: W0131 09:20:15.224843 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25966a5_bd6e_458d_b002_130480d742a5.slice/crio-6670f8aa5b9439162a0ed9c89d67d90ca1daefa0095473ede4c0b52e33e51b44 WatchSource:0}: Error finding container 6670f8aa5b9439162a0ed9c89d67d90ca1daefa0095473ede4c0b52e33e51b44: Status 404 returned error can't find the container with id 6670f8aa5b9439162a0ed9c89d67d90ca1daefa0095473ede4c0b52e33e51b44
Jan 31 09:20:15 crc kubenswrapper[4783]: I0131 09:20:15.420308 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerStarted","Data":"6670f8aa5b9439162a0ed9c89d67d90ca1daefa0095473ede4c0b52e33e51b44"}
Jan 31 09:20:15 crc kubenswrapper[4783]: I0131 09:20:15.654248 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12de2fc0-cdd7-4abb-8892-bded5646da4a" path="/var/lib/kubelet/pods/12de2fc0-cdd7-4abb-8892-bded5646da4a/volumes"
Jan 31 09:20:16 crc kubenswrapper[4783]: I0131 09:20:16.438677 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerStarted","Data":"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583"}
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.449709 4783 generic.go:334] "Generic (PLEG): container finished" podID="32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" containerID="fbd9072a8d3842a95774cac15afdcedf7d6510f10e98405f7df9cb200014801e" exitCode=0
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.449784 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwssw" event={"ID":"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef","Type":"ContainerDied","Data":"fbd9072a8d3842a95774cac15afdcedf7d6510f10e98405f7df9cb200014801e"}
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.454634 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerStarted","Data":"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753"}
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.454665 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerStarted","Data":"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538"}
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.756283 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:20:17 crc kubenswrapper[4783]: I0131 09:20:17.756351 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.797122 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwssw"
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.959544 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle\") pod \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") "
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.959878 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xgjn\" (UniqueName: \"kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn\") pod \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") "
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.959921 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data\") pod \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") "
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.960014 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts\") pod \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\" (UID: \"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef\") "
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.966731 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts" (OuterVolumeSpecName: "scripts") pod "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" (UID: "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.967831 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn" (OuterVolumeSpecName: "kube-api-access-7xgjn") pod "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" (UID: "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef"). InnerVolumeSpecName "kube-api-access-7xgjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.984796 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data" (OuterVolumeSpecName: "config-data") pod "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" (UID: "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:18 crc kubenswrapper[4783]: I0131 09:20:18.986736 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" (UID: "32d39e3f-c526-4ce8-8ca2-f7fa369be6ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.061879 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.061922 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xgjn\" (UniqueName: \"kubernetes.io/projected/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-kube-api-access-7xgjn\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.061938 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.061947 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.472263 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwssw" event={"ID":"32d39e3f-c526-4ce8-8ca2-f7fa369be6ef","Type":"ContainerDied","Data":"da822e6422e728c880a8930284618b0ec377febdbe6f9b73ec17cba7d63b6b07"}
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.472531 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da822e6422e728c880a8930284618b0ec377febdbe6f9b73ec17cba7d63b6b07"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.472836 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwssw"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.476300 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerStarted","Data":"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718"}
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.477107 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.494182 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6780126929999999 podStartE2EDuration="5.494152173s" podCreationTimestamp="2026-01-31 09:20:14 +0000 UTC" firstStartedPulling="2026-01-31 09:20:15.226982838 +0000 UTC m=+925.895666307" lastFinishedPulling="2026-01-31 09:20:19.043122319 +0000 UTC m=+929.711805787" observedRunningTime="2026-01-31 09:20:19.492004494 +0000 UTC m=+930.160687963" watchObservedRunningTime="2026-01-31 09:20:19.494152173 +0000 UTC m=+930.162835642"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.558699 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 31 09:20:19 crc kubenswrapper[4783]: E0131 09:20:19.559185 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" containerName="nova-cell0-conductor-db-sync"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.559205 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" containerName="nova-cell0-conductor-db-sync"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.559412 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" containerName="nova-cell0-conductor-db-sync"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.560046 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.570542 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kv5l5"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.576408 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.582022 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.678047 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.678120 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.678339 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg8q6\" (UniqueName: \"kubernetes.io/projected/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-kube-api-access-kg8q6\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0"
Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.780634 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-kg8q6\" (UniqueName: \"kubernetes.io/projected/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-kube-api-access-kg8q6\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.780759 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.780829 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.786571 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.788728 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.795840 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg8q6\" (UniqueName: \"kubernetes.io/projected/234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7-kube-api-access-kg8q6\") pod 
\"nova-cell0-conductor-0\" (UID: \"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:19 crc kubenswrapper[4783]: I0131 09:20:19.886021 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:20 crc kubenswrapper[4783]: I0131 09:20:20.342370 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 31 09:20:20 crc kubenswrapper[4783]: W0131 09:20:20.344128 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod234cb04e_3bcc_47d4_95ef_16eb1c8ad3c7.slice/crio-0114e6b1cf08a13a9c600cd25eff50b6ebe054f8392a8dfcfa964be886724188 WatchSource:0}: Error finding container 0114e6b1cf08a13a9c600cd25eff50b6ebe054f8392a8dfcfa964be886724188: Status 404 returned error can't find the container with id 0114e6b1cf08a13a9c600cd25eff50b6ebe054f8392a8dfcfa964be886724188 Jan 31 09:20:20 crc kubenswrapper[4783]: I0131 09:20:20.498535 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7","Type":"ContainerStarted","Data":"0114e6b1cf08a13a9c600cd25eff50b6ebe054f8392a8dfcfa964be886724188"} Jan 31 09:20:21 crc kubenswrapper[4783]: I0131 09:20:21.506889 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7","Type":"ContainerStarted","Data":"7c2839df0aa95086e1a3c0f28df209d12896491d55ef7b7095c9a4097e648149"} Jan 31 09:20:21 crc kubenswrapper[4783]: I0131 09:20:21.507369 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:21 crc kubenswrapper[4783]: I0131 09:20:21.528123 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.528111511 
podStartE2EDuration="2.528111511s" podCreationTimestamp="2026-01-31 09:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:21.522898885 +0000 UTC m=+932.191582353" watchObservedRunningTime="2026-01-31 09:20:21.528111511 +0000 UTC m=+932.196794979" Jan 31 09:20:29 crc kubenswrapper[4783]: I0131 09:20:29.914731 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.346092 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-sc4zh"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.347429 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.349684 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.351297 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.354083 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sc4zh"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.493191 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.494968 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.497040 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.499806 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.499851 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.501313 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.501349 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfgdt\" (UniqueName: \"kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.510225 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.511409 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.535900 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.604487 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613263 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613314 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613340 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfgdt\" (UniqueName: \"kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613379 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts\") pod 
\"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613403 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613427 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9fdd\" (UniqueName: \"kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613467 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.613496 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.642863 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.655030 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfgdt\" (UniqueName: 
\"kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.655500 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.660820 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.661273 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts\") pod \"nova-cell0-cell-mapping-sc4zh\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.678973 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.717784 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.719130 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.721789 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.722023 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.722135 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.722269 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hdv\" (UniqueName: \"kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.723383 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9fdd\" (UniqueName: \"kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 
09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.723590 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.723711 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.724280 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.724959 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.743887 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.754687 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.755970 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " 
pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.756446 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9fdd\" (UniqueName: \"kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd\") pod \"nova-api-0\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.763210 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.764822 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.767658 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.770040 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.776761 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.780427 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.786968 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825793 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825830 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825872 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825921 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825948 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825965 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.825994 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826022 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhr7x\" (UniqueName: \"kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826045 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826098 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826123 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brks5\" (UniqueName: \"kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826143 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826182 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hdv\" (UniqueName: \"kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826227 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826254 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7kx5\" (UniqueName: 
\"kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.826271 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.830042 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.830645 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.831570 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.841517 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hdv\" (UniqueName: \"kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.844480 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.928940 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.928981 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7kx5\" (UniqueName: \"kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.929003 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.929049 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.929070 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc 
kubenswrapper[4783]: I0131 09:20:30.929103 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.929125 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.929146 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930123 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930855 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930896 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhr7x\" (UniqueName: 
\"kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930917 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930939 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brks5\" (UniqueName: \"kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.930958 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.931333 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.933486 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " 
pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.933575 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.933995 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.934044 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.934583 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.936860 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.937237 4783 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.945175 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.952053 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhr7x\" (UniqueName: \"kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x\") pod \"dnsmasq-dns-557bbc7df7-9ljpj\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.953740 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brks5\" (UniqueName: \"kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5\") pod \"nova-metadata-0\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " pod="openstack/nova-metadata-0" Jan 31 09:20:30 crc kubenswrapper[4783]: I0131 09:20:30.955959 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7kx5\" (UniqueName: \"kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5\") pod \"nova-scheduler-0\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.132609 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.140748 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.147471 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.260543 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.346683 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sc4zh"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.470612 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h95z6"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.471797 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.473896 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.475490 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.479552 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h95z6"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.543341 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.548286 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.548355 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m4gq\" (UniqueName: \"kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.548613 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " 
pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.548664 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: W0131 09:20:31.560685 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce903588_41a7_4443_81a3_3d2db239c3a5.slice/crio-aa2981056306d7e5e079b0743bc8c03db85b5ff6b068b0e428a7ebcfdd056961 WatchSource:0}: Error finding container aa2981056306d7e5e079b0743bc8c03db85b5ff6b068b0e428a7ebcfdd056961: Status 404 returned error can't find the container with id aa2981056306d7e5e079b0743bc8c03db85b5ff6b068b0e428a7ebcfdd056961 Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.620832 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sc4zh" event={"ID":"081ac503-fb08-4990-9019-be9c35167de3","Type":"ContainerStarted","Data":"c3d917f33cf7e46bcc19a946b53c372a9598023906e4501825aed919076c9be7"} Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.620880 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sc4zh" event={"ID":"081ac503-fb08-4990-9019-be9c35167de3","Type":"ContainerStarted","Data":"842a785bf28830ed56adb6606e159f25b20f6724a39b8cdf63c5e97f36f2aae3"} Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.638655 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerStarted","Data":"098758239ab3132c80a3b5b065fc04c62dfe18c053712b6e4b6d6bd648387ab7"} Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.640391 4783 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce903588-41a7-4443-81a3-3d2db239c3a5","Type":"ContainerStarted","Data":"aa2981056306d7e5e079b0743bc8c03db85b5ff6b068b0e428a7ebcfdd056961"} Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.650057 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.650094 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m4gq\" (UniqueName: \"kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.650238 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.650261 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.655674 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.656362 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-sc4zh" podStartSLOduration=1.6563452760000001 podStartE2EDuration="1.656345276s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:31.647926144 +0000 UTC m=+942.316609612" watchObservedRunningTime="2026-01-31 09:20:31.656345276 +0000 UTC m=+942.325028744" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.656549 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.662273 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.667336 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m4gq\" (UniqueName: \"kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq\") pod \"nova-cell1-conductor-db-sync-h95z6\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 
crc kubenswrapper[4783]: I0131 09:20:31.802852 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.810005 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:20:31 crc kubenswrapper[4783]: W0131 09:20:31.825374 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d965f76_ea10_4246_aeb2_014ba9f3fd65.slice/crio-f9aaa7f823ec5a17abfa5a323cdf131d1fdbbe3fae48cf110ffdb6f2d6ac7aac WatchSource:0}: Error finding container f9aaa7f823ec5a17abfa5a323cdf131d1fdbbe3fae48cf110ffdb6f2d6ac7aac: Status 404 returned error can't find the container with id f9aaa7f823ec5a17abfa5a323cdf131d1fdbbe3fae48cf110ffdb6f2d6ac7aac Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.892940 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:20:31 crc kubenswrapper[4783]: I0131 09:20:31.904425 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.249803 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h95z6"] Jan 31 09:20:32 crc kubenswrapper[4783]: W0131 09:20:32.252246 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fbba53b_4460_44f1_9745_1c627089c168.slice/crio-c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049 WatchSource:0}: Error finding container c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049: Status 404 returned error can't find the container with id c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049 Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.678050 4783 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerStarted","Data":"103cfdd0e4039f0a397367979358da96445a294837f95596de04cc415607f336"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.681205 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c","Type":"ContainerStarted","Data":"bcce2912a72f402cc758abd63abefa51d8fc82724256b0ac570ffee6de52901f"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.682900 4783 generic.go:334] "Generic (PLEG): container finished" podID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerID="91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc" exitCode=0 Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.682945 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" event={"ID":"0d965f76-ea10-4246-aeb2-014ba9f3fd65","Type":"ContainerDied","Data":"91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.682961 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" event={"ID":"0d965f76-ea10-4246-aeb2-014ba9f3fd65","Type":"ContainerStarted","Data":"f9aaa7f823ec5a17abfa5a323cdf131d1fdbbe3fae48cf110ffdb6f2d6ac7aac"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.687440 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h95z6" event={"ID":"7fbba53b-4460-44f1-9745-1c627089c168","Type":"ContainerStarted","Data":"fb61b6f9b06124081a24706bc08481bfbf889a8936a74032b6cafee05c43aa65"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.687468 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h95z6" 
event={"ID":"7fbba53b-4460-44f1-9745-1c627089c168","Type":"ContainerStarted","Data":"c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049"} Jan 31 09:20:32 crc kubenswrapper[4783]: I0131 09:20:32.723490 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-h95z6" podStartSLOduration=1.723441721 podStartE2EDuration="1.723441721s" podCreationTimestamp="2026-01-31 09:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:32.716866625 +0000 UTC m=+943.385550093" watchObservedRunningTime="2026-01-31 09:20:32.723441721 +0000 UTC m=+943.392125189" Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.103047 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.134853 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.730020 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerStarted","Data":"9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.730284 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerStarted","Data":"f8398da06cfb5ed935a264626e50ceae9fe2e4e1cc415cf55ad76b9d5d64ca04"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.730275 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-log" containerID="cri-o://f8398da06cfb5ed935a264626e50ceae9fe2e4e1cc415cf55ad76b9d5d64ca04" gracePeriod=30 Jan 31 
09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.730361 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-metadata" containerID="cri-o://9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4" gracePeriod=30 Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.733470 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerStarted","Data":"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.733500 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerStarted","Data":"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.736984 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" event={"ID":"0d965f76-ea10-4246-aeb2-014ba9f3fd65","Type":"ContainerStarted","Data":"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.737113 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.742805 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce903588-41a7-4443-81a3-3d2db239c3a5","Type":"ContainerStarted","Data":"20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c"} Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.742922 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ce903588-41a7-4443-81a3-3d2db239c3a5" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c" gracePeriod=30 Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.755132 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.728406188 podStartE2EDuration="4.755120287s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="2026-01-31 09:20:31.92732234 +0000 UTC m=+942.596005808" lastFinishedPulling="2026-01-31 09:20:33.954036429 +0000 UTC m=+944.622719907" observedRunningTime="2026-01-31 09:20:34.749271312 +0000 UTC m=+945.417954780" watchObservedRunningTime="2026-01-31 09:20:34.755120287 +0000 UTC m=+945.423803756" Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.774560 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.386183456 podStartE2EDuration="4.774547239s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="2026-01-31 09:20:31.566393456 +0000 UTC m=+942.235076923" lastFinishedPulling="2026-01-31 09:20:33.954757238 +0000 UTC m=+944.623440706" observedRunningTime="2026-01-31 09:20:34.769197495 +0000 UTC m=+945.437880964" watchObservedRunningTime="2026-01-31 09:20:34.774547239 +0000 UTC m=+945.443230708" Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 09:20:34.793880 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" podStartSLOduration=4.793865637 podStartE2EDuration="4.793865637s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:34.792576778 +0000 UTC m=+945.461260246" watchObservedRunningTime="2026-01-31 09:20:34.793865637 +0000 UTC m=+945.462549095" Jan 31 09:20:34 crc kubenswrapper[4783]: I0131 
09:20:34.813208 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.164111658 podStartE2EDuration="4.813194075s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="2026-01-31 09:20:31.306346455 +0000 UTC m=+941.975029924" lastFinishedPulling="2026-01-31 09:20:33.955428874 +0000 UTC m=+944.624112341" observedRunningTime="2026-01-31 09:20:34.802274499 +0000 UTC m=+945.470957967" watchObservedRunningTime="2026-01-31 09:20:34.813194075 +0000 UTC m=+945.481877543" Jan 31 09:20:35 crc kubenswrapper[4783]: I0131 09:20:35.750437 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c","Type":"ContainerStarted","Data":"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f"} Jan 31 09:20:35 crc kubenswrapper[4783]: I0131 09:20:35.752534 4783 generic.go:334] "Generic (PLEG): container finished" podID="ee339729-83ff-4126-8594-4ad2894aca05" containerID="f8398da06cfb5ed935a264626e50ceae9fe2e4e1cc415cf55ad76b9d5d64ca04" exitCode=143 Jan 31 09:20:35 crc kubenswrapper[4783]: I0131 09:20:35.752614 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerDied","Data":"f8398da06cfb5ed935a264626e50ceae9fe2e4e1cc415cf55ad76b9d5d64ca04"} Jan 31 09:20:35 crc kubenswrapper[4783]: I0131 09:20:35.764998 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.619686089 podStartE2EDuration="5.764982359s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="2026-01-31 09:20:31.93035696 +0000 UTC m=+942.599040428" lastFinishedPulling="2026-01-31 09:20:35.075653231 +0000 UTC m=+945.744336698" observedRunningTime="2026-01-31 09:20:35.762889614 +0000 UTC m=+946.431573082" watchObservedRunningTime="2026-01-31 09:20:35.764982359 
+0000 UTC m=+946.433665827" Jan 31 09:20:35 crc kubenswrapper[4783]: I0131 09:20:35.845049 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:20:36 crc kubenswrapper[4783]: I0131 09:20:36.133132 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:20:36 crc kubenswrapper[4783]: I0131 09:20:36.141331 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:20:36 crc kubenswrapper[4783]: I0131 09:20:36.141382 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:20:36 crc kubenswrapper[4783]: I0131 09:20:36.765786 4783 generic.go:334] "Generic (PLEG): container finished" podID="7fbba53b-4460-44f1-9745-1c627089c168" containerID="fb61b6f9b06124081a24706bc08481bfbf889a8936a74032b6cafee05c43aa65" exitCode=0 Jan 31 09:20:36 crc kubenswrapper[4783]: I0131 09:20:36.765884 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h95z6" event={"ID":"7fbba53b-4460-44f1-9745-1c627089c168","Type":"ContainerDied","Data":"fb61b6f9b06124081a24706bc08481bfbf889a8936a74032b6cafee05c43aa65"} Jan 31 09:20:37 crc kubenswrapper[4783]: I0131 09:20:37.780877 4783 generic.go:334] "Generic (PLEG): container finished" podID="081ac503-fb08-4990-9019-be9c35167de3" containerID="c3d917f33cf7e46bcc19a946b53c372a9598023906e4501825aed919076c9be7" exitCode=0 Jan 31 09:20:37 crc kubenswrapper[4783]: I0131 09:20:37.780977 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sc4zh" event={"ID":"081ac503-fb08-4990-9019-be9c35167de3","Type":"ContainerDied","Data":"c3d917f33cf7e46bcc19a946b53c372a9598023906e4501825aed919076c9be7"} Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.105272 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.302460 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts\") pod \"7fbba53b-4460-44f1-9745-1c627089c168\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.302585 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle\") pod \"7fbba53b-4460-44f1-9745-1c627089c168\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.302646 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m4gq\" (UniqueName: \"kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq\") pod \"7fbba53b-4460-44f1-9745-1c627089c168\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.302855 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data\") pod \"7fbba53b-4460-44f1-9745-1c627089c168\" (UID: \"7fbba53b-4460-44f1-9745-1c627089c168\") " Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.310375 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts" (OuterVolumeSpecName: "scripts") pod "7fbba53b-4460-44f1-9745-1c627089c168" (UID: "7fbba53b-4460-44f1-9745-1c627089c168"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.310402 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq" (OuterVolumeSpecName: "kube-api-access-6m4gq") pod "7fbba53b-4460-44f1-9745-1c627089c168" (UID: "7fbba53b-4460-44f1-9745-1c627089c168"). InnerVolumeSpecName "kube-api-access-6m4gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.331526 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data" (OuterVolumeSpecName: "config-data") pod "7fbba53b-4460-44f1-9745-1c627089c168" (UID: "7fbba53b-4460-44f1-9745-1c627089c168"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.333432 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fbba53b-4460-44f1-9745-1c627089c168" (UID: "7fbba53b-4460-44f1-9745-1c627089c168"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.405877 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.406003 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.406078 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m4gq\" (UniqueName: \"kubernetes.io/projected/7fbba53b-4460-44f1-9745-1c627089c168-kube-api-access-6m4gq\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.406147 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fbba53b-4460-44f1-9745-1c627089c168-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.793787 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h95z6" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.793783 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h95z6" event={"ID":"7fbba53b-4460-44f1-9745-1c627089c168","Type":"ContainerDied","Data":"c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049"} Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.793854 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1e6f99980990de328b75b5df60905453db4508da7a20a38920c87972156d049" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.866775 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:20:38 crc kubenswrapper[4783]: E0131 09:20:38.867285 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbba53b-4460-44f1-9745-1c627089c168" containerName="nova-cell1-conductor-db-sync" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.867304 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbba53b-4460-44f1-9745-1c627089c168" containerName="nova-cell1-conductor-db-sync" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.867514 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbba53b-4460-44f1-9745-1c627089c168" containerName="nova-cell1-conductor-db-sync" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.868281 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.870352 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 31 09:20:38 crc kubenswrapper[4783]: I0131 09:20:38.876793 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.026724 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.026940 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.027281 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpf4\" (UniqueName: \"kubernetes.io/projected/5d42e544-b623-439f-b4a8-9ee7cb72386c-kube-api-access-jqpf4\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.129851 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpf4\" (UniqueName: \"kubernetes.io/projected/5d42e544-b623-439f-b4a8-9ee7cb72386c-kube-api-access-jqpf4\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 
09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.130330 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.130389 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.134535 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.134647 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d42e544-b623-439f-b4a8-9ee7cb72386c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.146074 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpf4\" (UniqueName: \"kubernetes.io/projected/5d42e544-b623-439f-b4a8-9ee7cb72386c-kube-api-access-jqpf4\") pod \"nova-cell1-conductor-0\" (UID: \"5d42e544-b623-439f-b4a8-9ee7cb72386c\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.189857 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.267813 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.333723 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle\") pod \"081ac503-fb08-4990-9019-be9c35167de3\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.333775 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts\") pod \"081ac503-fb08-4990-9019-be9c35167de3\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.333871 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data\") pod \"081ac503-fb08-4990-9019-be9c35167de3\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.333969 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfgdt\" (UniqueName: \"kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt\") pod \"081ac503-fb08-4990-9019-be9c35167de3\" (UID: \"081ac503-fb08-4990-9019-be9c35167de3\") " Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.340282 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt" (OuterVolumeSpecName: "kube-api-access-hfgdt") pod "081ac503-fb08-4990-9019-be9c35167de3" (UID: 
"081ac503-fb08-4990-9019-be9c35167de3"). InnerVolumeSpecName "kube-api-access-hfgdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.340424 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts" (OuterVolumeSpecName: "scripts") pod "081ac503-fb08-4990-9019-be9c35167de3" (UID: "081ac503-fb08-4990-9019-be9c35167de3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.366181 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "081ac503-fb08-4990-9019-be9c35167de3" (UID: "081ac503-fb08-4990-9019-be9c35167de3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.366471 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data" (OuterVolumeSpecName: "config-data") pod "081ac503-fb08-4990-9019-be9c35167de3" (UID: "081ac503-fb08-4990-9019-be9c35167de3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.436416 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfgdt\" (UniqueName: \"kubernetes.io/projected/081ac503-fb08-4990-9019-be9c35167de3-kube-api-access-hfgdt\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.436735 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.436751 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.436762 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/081ac503-fb08-4990-9019-be9c35167de3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.608379 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.817245 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sc4zh" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.818100 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sc4zh" event={"ID":"081ac503-fb08-4990-9019-be9c35167de3","Type":"ContainerDied","Data":"842a785bf28830ed56adb6606e159f25b20f6724a39b8cdf63c5e97f36f2aae3"} Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.818124 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842a785bf28830ed56adb6606e159f25b20f6724a39b8cdf63c5e97f36f2aae3" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.819942 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5d42e544-b623-439f-b4a8-9ee7cb72386c","Type":"ContainerStarted","Data":"f5c1643607d5f785e67d366998868d04f2fc17f55081da11a0e1c05f212c36f7"} Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.819978 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5d42e544-b623-439f-b4a8-9ee7cb72386c","Type":"ContainerStarted","Data":"5f5990c5f640703f6826d9ca9068efcba995610c6b2db7ff3b962f3acf749913"} Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.820094 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.840523 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.84050613 podStartE2EDuration="1.84050613s" podCreationTimestamp="2026-01-31 09:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:39.838276868 +0000 UTC m=+950.506960335" watchObservedRunningTime="2026-01-31 09:20:39.84050613 +0000 UTC m=+950.509189599" Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 
09:20:39.975540 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.975768 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-log" containerID="cri-o://bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" gracePeriod=30 Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.975980 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-api" containerID="cri-o://596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" gracePeriod=30 Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.985623 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:39 crc kubenswrapper[4783]: I0131 09:20:39.985797 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" containerName="nova-scheduler-scheduler" containerID="cri-o://f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f" gracePeriod=30 Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.444909 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.457511 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle\") pod \"fbacb461-6b27-45c8-9613-0c58669c2d8e\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.457564 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs\") pod \"fbacb461-6b27-45c8-9613-0c58669c2d8e\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.457612 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9fdd\" (UniqueName: \"kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd\") pod \"fbacb461-6b27-45c8-9613-0c58669c2d8e\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.458743 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs" (OuterVolumeSpecName: "logs") pod "fbacb461-6b27-45c8-9613-0c58669c2d8e" (UID: "fbacb461-6b27-45c8-9613-0c58669c2d8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.462543 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd" (OuterVolumeSpecName: "kube-api-access-t9fdd") pod "fbacb461-6b27-45c8-9613-0c58669c2d8e" (UID: "fbacb461-6b27-45c8-9613-0c58669c2d8e"). InnerVolumeSpecName "kube-api-access-t9fdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.491884 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbacb461-6b27-45c8-9613-0c58669c2d8e" (UID: "fbacb461-6b27-45c8-9613-0c58669c2d8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.560290 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data\") pod \"fbacb461-6b27-45c8-9613-0c58669c2d8e\" (UID: \"fbacb461-6b27-45c8-9613-0c58669c2d8e\") " Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.561856 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9fdd\" (UniqueName: \"kubernetes.io/projected/fbacb461-6b27-45c8-9613-0c58669c2d8e-kube-api-access-t9fdd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.561886 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.561897 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacb461-6b27-45c8-9613-0c58669c2d8e-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.590188 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data" (OuterVolumeSpecName: "config-data") pod "fbacb461-6b27-45c8-9613-0c58669c2d8e" (UID: "fbacb461-6b27-45c8-9613-0c58669c2d8e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.663588 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacb461-6b27-45c8-9613-0c58669c2d8e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828346 4783 generic.go:334] "Generic (PLEG): container finished" podID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerID="596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" exitCode=0 Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828379 4783 generic.go:334] "Generic (PLEG): container finished" podID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerID="bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" exitCode=143 Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828386 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerDied","Data":"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4"} Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828439 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerDied","Data":"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b"} Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828452 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbacb461-6b27-45c8-9613-0c58669c2d8e","Type":"ContainerDied","Data":"098758239ab3132c80a3b5b065fc04c62dfe18c053712b6e4b6d6bd648387ab7"} Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828450 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.828469 4783 scope.go:117] "RemoveContainer" containerID="596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.854406 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.855447 4783 scope.go:117] "RemoveContainer" containerID="bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.861470 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.882911 4783 scope.go:117] "RemoveContainer" containerID="596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" Jan 31 09:20:40 crc kubenswrapper[4783]: E0131 09:20:40.883352 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4\": container with ID starting with 596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4 not found: ID does not exist" containerID="596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.883406 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4"} err="failed to get container status \"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4\": rpc error: code = NotFound desc = could not find container \"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4\": container with ID starting with 596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4 not found: ID does not exist" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 
09:20:40.883428 4783 scope.go:117] "RemoveContainer" containerID="bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.884411 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:40 crc kubenswrapper[4783]: E0131 09:20:40.884936 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-api" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.884958 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-api" Jan 31 09:20:40 crc kubenswrapper[4783]: E0131 09:20:40.884974 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081ac503-fb08-4990-9019-be9c35167de3" containerName="nova-manage" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.884980 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="081ac503-fb08-4990-9019-be9c35167de3" containerName="nova-manage" Jan 31 09:20:40 crc kubenswrapper[4783]: E0131 09:20:40.884993 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-log" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.885001 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-log" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.885656 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-api" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.885710 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="081ac503-fb08-4990-9019-be9c35167de3" containerName="nova-manage" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.885733 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" containerName="nova-api-log" Jan 31 09:20:40 crc kubenswrapper[4783]: E0131 09:20:40.886590 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b\": container with ID starting with bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b not found: ID does not exist" containerID="bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.886617 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b"} err="failed to get container status \"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b\": rpc error: code = NotFound desc = could not find container \"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b\": container with ID starting with bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b not found: ID does not exist" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.886637 4783 scope.go:117] "RemoveContainer" containerID="596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.886891 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.888277 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4"} err="failed to get container status \"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4\": rpc error: code = NotFound desc = could not find container \"596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4\": container with ID starting with 596cbd43020d713eb068fa5cd088a79f7b62ba31a4db32ee3ac565e58cd189d4 not found: ID does not exist" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.891581 4783 scope.go:117] "RemoveContainer" containerID="bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.888633 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.898913 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:40 crc kubenswrapper[4783]: I0131 09:20:40.900250 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b"} err="failed to get container status \"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b\": rpc error: code = NotFound desc = could not find container \"bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b\": container with ID starting with bd0ea24ace960d3620251d07c038cda62702bd6c9d8ff470866d4dd197eb318b not found: ID does not exist" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.086862 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.087102 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.087256 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.087301 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv4h7\" (UniqueName: \"kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.149960 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.189626 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.189702 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.189728 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv4h7\" (UniqueName: \"kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.189807 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.191835 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.203461 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.216747 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv4h7\" (UniqueName: \"kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.217701 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.218368 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.218740 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="dnsmasq-dns" containerID="cri-o://78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964" gracePeriod=10 Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.515583 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.657054 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbacb461-6b27-45c8-9613-0c58669c2d8e" path="/var/lib/kubelet/pods/fbacb461-6b27-45c8-9613-0c58669c2d8e/volumes" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.734504 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.802777 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk8km\" (UniqueName: \"kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.802854 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.802909 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.802937 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.802959 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.803064 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc\") pod \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\" (UID: \"d3f5feb6-3f16-411f-8bea-22a5badd3b44\") " Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.808491 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km" (OuterVolumeSpecName: "kube-api-access-xk8km") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "kube-api-access-xk8km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843293 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843710 4783 generic.go:334] "Generic (PLEG): container finished" podID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerID="78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964" exitCode=0 Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843748 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" event={"ID":"d3f5feb6-3f16-411f-8bea-22a5badd3b44","Type":"ContainerDied","Data":"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964"} Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843775 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" event={"ID":"d3f5feb6-3f16-411f-8bea-22a5badd3b44","Type":"ContainerDied","Data":"0c117cbda33455270d4d9caa2b7fe2b38f98e246fe8ae13199fe34d55837efd2"} Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843795 4783 scope.go:117] "RemoveContainer" containerID="78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.843960 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-ck9lr" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.851707 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config" (OuterVolumeSpecName: "config") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.853031 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.854573 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.862405 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3f5feb6-3f16-411f-8bea-22a5badd3b44" (UID: "d3f5feb6-3f16-411f-8bea-22a5badd3b44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.875097 4783 scope.go:117] "RemoveContainer" containerID="8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.897924 4783 scope.go:117] "RemoveContainer" containerID="78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964" Jan 31 09:20:41 crc kubenswrapper[4783]: E0131 09:20:41.898305 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964\": container with ID starting with 78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964 not found: ID does not exist" containerID="78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.898337 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964"} err="failed to get container status \"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964\": rpc error: code = NotFound desc = could not find container \"78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964\": container with ID starting with 78b26e2e79fe847f647a8824f0c43055c512862d8555688726ab9c901f71e964 not found: ID does not exist" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.898356 4783 scope.go:117] "RemoveContainer" containerID="8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8" Jan 31 09:20:41 crc kubenswrapper[4783]: E0131 09:20:41.898686 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8\": container with ID starting with 
8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8 not found: ID does not exist" containerID="8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.898728 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8"} err="failed to get container status \"8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8\": rpc error: code = NotFound desc = could not find container \"8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8\": container with ID starting with 8b3fe46c4e5e2878ac60abc51b076bcb93568b4db9c3fc5df7ce21d3ccb5b2a8 not found: ID does not exist" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905211 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905231 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk8km\" (UniqueName: \"kubernetes.io/projected/d3f5feb6-3f16-411f-8bea-22a5badd3b44-kube-api-access-xk8km\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905240 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905249 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905259 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.905267 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f5feb6-3f16-411f-8bea-22a5badd3b44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:41 crc kubenswrapper[4783]: I0131 09:20:41.956650 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:20:42 crc kubenswrapper[4783]: W0131 09:20:42.026291 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95349f97_4f73_4e01_9e3c_e7c61fee8f5b.slice/crio-fd83bcbd259213dba42fc9ff0f8faad08a42f76eb883bfcb16628dc37cc59c26 WatchSource:0}: Error finding container fd83bcbd259213dba42fc9ff0f8faad08a42f76eb883bfcb16628dc37cc59c26: Status 404 returned error can't find the container with id fd83bcbd259213dba42fc9ff0f8faad08a42f76eb883bfcb16628dc37cc59c26 Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.190529 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.196332 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-ck9lr"] Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.376855 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.518285 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7kx5\" (UniqueName: \"kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5\") pod \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.518328 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data\") pod \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.518368 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle\") pod \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\" (UID: \"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c\") " Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.524672 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5" (OuterVolumeSpecName: "kube-api-access-m7kx5") pod "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" (UID: "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c"). InnerVolumeSpecName "kube-api-access-m7kx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.539918 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" (UID: "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.542220 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data" (OuterVolumeSpecName: "config-data") pod "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" (UID: "2dc29b34-2773-4aa2-bfd9-2c6a50ca220c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.622071 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7kx5\" (UniqueName: \"kubernetes.io/projected/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-kube-api-access-m7kx5\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.622101 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.622113 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.856589 4783 generic.go:334] "Generic (PLEG): container finished" podID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" containerID="f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f" exitCode=0 Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.856638 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.856668 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c","Type":"ContainerDied","Data":"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f"} Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.856712 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2dc29b34-2773-4aa2-bfd9-2c6a50ca220c","Type":"ContainerDied","Data":"bcce2912a72f402cc758abd63abefa51d8fc82724256b0ac570ffee6de52901f"} Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.856757 4783 scope.go:117] "RemoveContainer" containerID="f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.861655 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerStarted","Data":"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141"} Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.861716 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerStarted","Data":"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee"} Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.861729 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerStarted","Data":"fd83bcbd259213dba42fc9ff0f8faad08a42f76eb883bfcb16628dc37cc59c26"} Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.885228 4783 scope.go:117] "RemoveContainer" containerID="f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f" Jan 31 09:20:42 crc kubenswrapper[4783]: E0131 09:20:42.885625 4783 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f\": container with ID starting with f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f not found: ID does not exist" containerID="f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.885708 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f"} err="failed to get container status \"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f\": rpc error: code = NotFound desc = could not find container \"f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f\": container with ID starting with f9cc0fc539c2df63ce73fd8f4eb6154c235244ce76fe64f16420864160c5bb5f not found: ID does not exist" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.888292 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.888278957 podStartE2EDuration="2.888278957s" podCreationTimestamp="2026-01-31 09:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:42.880069612 +0000 UTC m=+953.548753090" watchObservedRunningTime="2026-01-31 09:20:42.888278957 +0000 UTC m=+953.556962425" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.907016 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.919277 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.948724 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:42 
crc kubenswrapper[4783]: E0131 09:20:42.949419 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="init" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.949449 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="init" Jan 31 09:20:42 crc kubenswrapper[4783]: E0131 09:20:42.949499 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" containerName="nova-scheduler-scheduler" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.949505 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" containerName="nova-scheduler-scheduler" Jan 31 09:20:42 crc kubenswrapper[4783]: E0131 09:20:42.949515 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="dnsmasq-dns" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.949521 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="dnsmasq-dns" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.950214 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" containerName="dnsmasq-dns" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.950231 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" containerName="nova-scheduler-scheduler" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.951144 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.954464 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 31 09:20:42 crc kubenswrapper[4783]: I0131 09:20:42.957916 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.142365 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.142564 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszvw\" (UniqueName: \"kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.142874 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.244607 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.245320 4783 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.245450 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszvw\" (UniqueName: \"kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.249960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.250344 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.261906 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszvw\" (UniqueName: \"kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw\") pod \"nova-scheduler-0\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.266113 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.654641 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc29b34-2773-4aa2-bfd9-2c6a50ca220c" path="/var/lib/kubelet/pods/2dc29b34-2773-4aa2-bfd9-2c6a50ca220c/volumes" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.655405 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f5feb6-3f16-411f-8bea-22a5badd3b44" path="/var/lib/kubelet/pods/d3f5feb6-3f16-411f-8bea-22a5badd3b44/volumes" Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.657537 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.874385 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2","Type":"ContainerStarted","Data":"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91"} Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.874440 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2","Type":"ContainerStarted","Data":"21344fbbd395ed610598b0db46ddbdb41b125a37903c64a83b90e25635ab7076"} Jan 31 09:20:43 crc kubenswrapper[4783]: I0131 09:20:43.891310 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.89128812 podStartE2EDuration="1.89128812s" podCreationTimestamp="2026-01-31 09:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:43.890288175 +0000 UTC m=+954.558971643" watchObservedRunningTime="2026-01-31 09:20:43.89128812 +0000 UTC m=+954.559971588" Jan 31 09:20:44 crc kubenswrapper[4783]: I0131 09:20:44.212890 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-conductor-0" Jan 31 09:20:44 crc kubenswrapper[4783]: I0131 09:20:44.829914 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.756855 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.757254 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.757309 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.758538 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.758661 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed" 
gracePeriod=600 Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.914510 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed" exitCode=0 Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.914586 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed"} Jan 31 09:20:47 crc kubenswrapper[4783]: I0131 09:20:47.914887 4783 scope.go:117] "RemoveContainer" containerID="63dec065c0e2ce55bb88687151f12c6eb92203eb247bb4dce8e626a9b6254663" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.003308 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.003550 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" containerName="kube-state-metrics" containerID="cri-o://06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179" gracePeriod=30 Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.266875 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.565064 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.581897 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsx6s\" (UniqueName: \"kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s\") pod \"4d24bc22-990e-4f9f-a39b-f16adc63dfbb\" (UID: \"4d24bc22-990e-4f9f-a39b-f16adc63dfbb\") " Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.588051 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s" (OuterVolumeSpecName: "kube-api-access-xsx6s") pod "4d24bc22-990e-4f9f-a39b-f16adc63dfbb" (UID: "4d24bc22-990e-4f9f-a39b-f16adc63dfbb"). InnerVolumeSpecName "kube-api-access-xsx6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.685093 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsx6s\" (UniqueName: \"kubernetes.io/projected/4d24bc22-990e-4f9f-a39b-f16adc63dfbb-kube-api-access-xsx6s\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.925109 4783 generic.go:334] "Generic (PLEG): container finished" podID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" containerID="06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179" exitCode=2 Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.925207 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d24bc22-990e-4f9f-a39b-f16adc63dfbb","Type":"ContainerDied","Data":"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179"} Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.925248 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4d24bc22-990e-4f9f-a39b-f16adc63dfbb","Type":"ContainerDied","Data":"69b8c1bf328e4446bd4627afbb35c253bb17283816be6385a9658223c6791896"} Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.925269 4783 scope.go:117] "RemoveContainer" containerID="06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.925372 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.928696 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d"} Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.949669 4783 scope.go:117] "RemoveContainer" containerID="06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179" Jan 31 09:20:48 crc kubenswrapper[4783]: E0131 09:20:48.950097 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179\": container with ID starting with 06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179 not found: ID does not exist" containerID="06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.950140 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179"} err="failed to get container status \"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179\": rpc error: code = NotFound desc = could not find container \"06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179\": container with ID starting with 
06197717bda753ea4439b9fc74f9a7109ffaae91b7a2da7633d06ef2812f0179 not found: ID does not exist" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.972487 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.983417 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.990451 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:48 crc kubenswrapper[4783]: E0131 09:20:48.990866 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" containerName="kube-state-metrics" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.990884 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" containerName="kube-state-metrics" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.991097 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" containerName="kube-state-metrics" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.991781 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.993801 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 31 09:20:48 crc kubenswrapper[4783]: I0131 09:20:48.994484 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.000270 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.090672 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwk7\" (UniqueName: \"kubernetes.io/projected/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-api-access-mvwk7\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.090730 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.090790 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.090822 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.192732 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwk7\" (UniqueName: \"kubernetes.io/projected/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-api-access-mvwk7\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.193097 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.193144 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.193212 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.198636 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.198641 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.202411 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.206432 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwk7\" (UniqueName: \"kubernetes.io/projected/7c07732e-abfe-48cc-86c9-b500fff4977d-kube-api-access-mvwk7\") pod \"kube-state-metrics-0\" (UID: \"7c07732e-abfe-48cc-86c9-b500fff4977d\") " pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.313767 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.660496 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d24bc22-990e-4f9f-a39b-f16adc63dfbb" path="/var/lib/kubelet/pods/4d24bc22-990e-4f9f-a39b-f16adc63dfbb/volumes" Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.738434 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.929484 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.930011 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-central-agent" containerID="cri-o://ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583" gracePeriod=30 Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.930085 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="proxy-httpd" containerID="cri-o://1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718" gracePeriod=30 Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.930124 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="sg-core" containerID="cri-o://341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753" gracePeriod=30 Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.930217 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-notification-agent" containerID="cri-o://1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538" 
gracePeriod=30 Jan 31 09:20:49 crc kubenswrapper[4783]: I0131 09:20:49.941550 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c07732e-abfe-48cc-86c9-b500fff4977d","Type":"ContainerStarted","Data":"8671a010df2ee569cae2f04ddb3b8001674a083c81ac98299bcb0f7e2d98348d"} Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.952830 4783 generic.go:334] "Generic (PLEG): container finished" podID="a25966a5-bd6e-458d-b002-130480d742a5" containerID="1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718" exitCode=0 Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.953549 4783 generic.go:334] "Generic (PLEG): container finished" podID="a25966a5-bd6e-458d-b002-130480d742a5" containerID="341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753" exitCode=2 Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.952898 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerDied","Data":"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718"} Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.953615 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerDied","Data":"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753"} Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.953630 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerDied","Data":"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583"} Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.953576 4783 generic.go:334] "Generic (PLEG): container finished" podID="a25966a5-bd6e-458d-b002-130480d742a5" containerID="ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583" exitCode=0 Jan 31 09:20:50 crc 
kubenswrapper[4783]: I0131 09:20:50.956228 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c07732e-abfe-48cc-86c9-b500fff4977d","Type":"ContainerStarted","Data":"7ef11c3b9d7559146e5133da14404624cad69d8bfe650540a4c90a29a434e27b"} Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.956384 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 31 09:20:50 crc kubenswrapper[4783]: I0131 09:20:50.976480 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.659493479 podStartE2EDuration="2.976465759s" podCreationTimestamp="2026-01-31 09:20:48 +0000 UTC" firstStartedPulling="2026-01-31 09:20:49.741901408 +0000 UTC m=+960.410584876" lastFinishedPulling="2026-01-31 09:20:50.058873688 +0000 UTC m=+960.727557156" observedRunningTime="2026-01-31 09:20:50.96961766 +0000 UTC m=+961.638301128" watchObservedRunningTime="2026-01-31 09:20:50.976465759 +0000 UTC m=+961.645149218" Jan 31 09:20:51 crc kubenswrapper[4783]: I0131 09:20:51.517005 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:20:51 crc kubenswrapper[4783]: I0131 09:20:51.517290 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:20:52 crc kubenswrapper[4783]: I0131 09:20:52.599401 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:20:52 crc kubenswrapper[4783]: I0131 09:20:52.599514 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-log" 
probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:20:53 crc kubenswrapper[4783]: I0131 09:20:53.267309 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 31 09:20:53 crc kubenswrapper[4783]: I0131 09:20:53.301199 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 31 09:20:54 crc kubenswrapper[4783]: I0131 09:20:54.007734 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.961707 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988087 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988177 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988258 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988317 4783 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988485 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988513 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.988552 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbkc5\" (UniqueName: \"kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5\") pod \"a25966a5-bd6e-458d-b002-130480d742a5\" (UID: \"a25966a5-bd6e-458d-b002-130480d742a5\") " Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.989592 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.990283 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.994459 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5" (OuterVolumeSpecName: "kube-api-access-mbkc5") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "kube-api-access-mbkc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:57 crc kubenswrapper[4783]: I0131 09:20:57.994735 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts" (OuterVolumeSpecName: "scripts") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.001048 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.026867 4783 generic.go:334] "Generic (PLEG): container finished" podID="a25966a5-bd6e-458d-b002-130480d742a5" containerID="1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538" exitCode=0 Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.026944 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerDied","Data":"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538"} Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.026982 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25966a5-bd6e-458d-b002-130480d742a5","Type":"ContainerDied","Data":"6670f8aa5b9439162a0ed9c89d67d90ca1daefa0095473ede4c0b52e33e51b44"} Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.027002 4783 scope.go:117] "RemoveContainer" containerID="1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.027013 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.032290 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.049507 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.073111 4783 scope.go:117] "RemoveContainer" containerID="341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.081085 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data" (OuterVolumeSpecName: "config-data") pod "a25966a5-bd6e-458d-b002-130480d742a5" (UID: "a25966a5-bd6e-458d-b002-130480d742a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.087176 4783 scope.go:117] "RemoveContainer" containerID="1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091825 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbkc5\" (UniqueName: \"kubernetes.io/projected/a25966a5-bd6e-458d-b002-130480d742a5-kube-api-access-mbkc5\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091855 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091866 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091877 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091887 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25966a5-bd6e-458d-b002-130480d742a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.091895 4783 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25966a5-bd6e-458d-b002-130480d742a5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.103678 4783 scope.go:117] "RemoveContainer" 
containerID="ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.118833 4783 scope.go:117] "RemoveContainer" containerID="1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.119257 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718\": container with ID starting with 1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718 not found: ID does not exist" containerID="1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.119381 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718"} err="failed to get container status \"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718\": rpc error: code = NotFound desc = could not find container \"1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718\": container with ID starting with 1993a5b26e2bf319d7f5b3ea0869e4c5daa4cfea3eb347f8472fd8e4dd3fe718 not found: ID does not exist" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.119495 4783 scope.go:117] "RemoveContainer" containerID="341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.119869 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753\": container with ID starting with 341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753 not found: ID does not exist" containerID="341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753" Jan 31 09:20:58 crc 
kubenswrapper[4783]: I0131 09:20:58.119914 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753"} err="failed to get container status \"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753\": rpc error: code = NotFound desc = could not find container \"341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753\": container with ID starting with 341353603d3543f03e4712085bad09cb28ab70bdc64398275ec4e78fbe91b753 not found: ID does not exist" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.119950 4783 scope.go:117] "RemoveContainer" containerID="1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.120333 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538\": container with ID starting with 1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538 not found: ID does not exist" containerID="1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.120479 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538"} err="failed to get container status \"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538\": rpc error: code = NotFound desc = could not find container \"1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538\": container with ID starting with 1bc1a13abc159d9a482463a4657c2c98b0c2cff2fa7152f086f5273895cc0538 not found: ID does not exist" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.120605 4783 scope.go:117] "RemoveContainer" containerID="ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583" Jan 31 
09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.121037 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583\": container with ID starting with ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583 not found: ID does not exist" containerID="ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.121081 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583"} err="failed to get container status \"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583\": rpc error: code = NotFound desc = could not find container \"ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583\": container with ID starting with ba6765c2eb8c111768ee94c4b9ea61a597f791dc47a2183ff754f80bc760c583 not found: ID does not exist" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.364362 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.373601 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.385860 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.386306 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="sg-core" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386325 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="sg-core" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.386351 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="proxy-httpd" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386358 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="proxy-httpd" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.386377 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-notification-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386383 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-notification-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: E0131 09:20:58.386395 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-central-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386401 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-central-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386592 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-central-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386608 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="proxy-httpd" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386619 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="ceilometer-notification-agent" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.386630 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25966a5-bd6e-458d-b002-130480d742a5" containerName="sg-core" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.388580 4783 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.390876 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.391317 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.391597 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.396783 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.396831 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.396855 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.396982 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.397047 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.397107 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpjk\" (UniqueName: \"kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.397262 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.397432 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.405517 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498221 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498273 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498296 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498319 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498345 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498369 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpjk\" (UniqueName: \"kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc 
kubenswrapper[4783]: I0131 09:20:58.498412 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.498464 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.500386 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.500471 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.502586 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.503448 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") 
" pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.504010 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.506384 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.506499 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.513056 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpjk\" (UniqueName: \"kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk\") pod \"ceilometer-0\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " pod="openstack/ceilometer-0" Jan 31 09:20:58 crc kubenswrapper[4783]: I0131 09:20:58.701680 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:20:59 crc kubenswrapper[4783]: I0131 09:20:59.123202 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:20:59 crc kubenswrapper[4783]: I0131 09:20:59.324718 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 31 09:20:59 crc kubenswrapper[4783]: I0131 09:20:59.657860 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25966a5-bd6e-458d-b002-130480d742a5" path="/var/lib/kubelet/pods/a25966a5-bd6e-458d-b002-130480d742a5/volumes" Jan 31 09:21:00 crc kubenswrapper[4783]: I0131 09:21:00.049980 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerStarted","Data":"489b82202105ab65bc850694eec75074736ea80da43a21a7df6bed411ac105ee"} Jan 31 09:21:00 crc kubenswrapper[4783]: I0131 09:21:00.050123 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerStarted","Data":"6f76376471cf3eeac9a73074e4c23ebadd86b1da4ecfd1b35fc7545fd22867d7"} Jan 31 09:21:01 crc kubenswrapper[4783]: I0131 09:21:01.063065 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerStarted","Data":"ff51223e8a63c87de5a9a645b9f0673f798d15a5c0f79a52e0d4bdd2ecfdd810"} Jan 31 09:21:01 crc kubenswrapper[4783]: I0131 09:21:01.520901 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:21:01 crc kubenswrapper[4783]: I0131 09:21:01.521418 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:21:01 crc kubenswrapper[4783]: I0131 09:21:01.523401 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Jan 31 09:21:01 crc kubenswrapper[4783]: I0131 09:21:01.523459 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.074488 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerStarted","Data":"fe6cf83b82351b0349144db7c6abc319736f851997900053ff240580877dd0ca"} Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.075462 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.078421 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.214073 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.215773 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.232465 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.306905 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.307006 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.307141 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.307364 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zb5\" (UniqueName: \"kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.307458 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.307720 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410347 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410483 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410598 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zb5\" (UniqueName: \"kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410630 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410685 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.410718 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.411524 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.411659 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.412300 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.412634 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.412945 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.450883 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zb5\" (UniqueName: \"kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5\") pod \"dnsmasq-dns-5ddd577785-c7hpn\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:02 crc kubenswrapper[4783]: I0131 09:21:02.531706 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:03 crc kubenswrapper[4783]: I0131 09:21:03.022857 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:21:03 crc kubenswrapper[4783]: I0131 09:21:03.083986 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" event={"ID":"6c8f0127-fb81-4060-8fc5-e12eab702218","Type":"ContainerStarted","Data":"3478164d633f631ea8d7f237c012ee310500242f92dd7fb929bba3731a7eb9c9"} Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.004202 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.094226 4783 generic.go:334] "Generic (PLEG): container finished" podID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerID="1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20" exitCode=0 Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.094311 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" event={"ID":"6c8f0127-fb81-4060-8fc5-e12eab702218","Type":"ContainerDied","Data":"1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20"} Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.101357 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerStarted","Data":"b8f1bc0fea0acf0b6281470f57fde1a69f28ff62f2f3296a576edd20de776e3e"} Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.109291 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.173064 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.118247807 podStartE2EDuration="6.173035298s" podCreationTimestamp="2026-01-31 09:20:58 
+0000 UTC" firstStartedPulling="2026-01-31 09:20:59.131588023 +0000 UTC m=+969.800271481" lastFinishedPulling="2026-01-31 09:21:03.186375504 +0000 UTC m=+973.855058972" observedRunningTime="2026-01-31 09:21:04.139425597 +0000 UTC m=+974.808109064" watchObservedRunningTime="2026-01-31 09:21:04.173035298 +0000 UTC m=+974.841718767" Jan 31 09:21:04 crc kubenswrapper[4783]: I0131 09:21:04.532824 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:05 crc kubenswrapper[4783]: E0131 09:21:05.055632 4783 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee339729_83ff_4126_8594_4ad2894aca05.slice/crio-9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce903588_41a7_4443_81a3_3d2db239c3a5.slice/crio-20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee339729_83ff_4126_8594_4ad2894aca05.slice/crio-conmon-9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce903588_41a7_4443_81a3_3d2db239c3a5.slice/crio-conmon-20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.121960 4783 generic.go:334] "Generic (PLEG): container finished" podID="ee339729-83ff-4126-8594-4ad2894aca05" containerID="9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4" exitCode=137 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.122064 4783 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerDied","Data":"9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4"} Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.123408 4783 generic.go:334] "Generic (PLEG): container finished" podID="ce903588-41a7-4443-81a3-3d2db239c3a5" containerID="20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c" exitCode=137 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.123461 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce903588-41a7-4443-81a3-3d2db239c3a5","Type":"ContainerDied","Data":"20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c"} Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.125815 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-log" containerID="cri-o://ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126063 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" event={"ID":"6c8f0127-fb81-4060-8fc5-e12eab702218","Type":"ContainerStarted","Data":"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc"} Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126265 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-central-agent" containerID="cri-o://489b82202105ab65bc850694eec75074736ea80da43a21a7df6bed411ac105ee" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126589 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" 
containerName="nova-api-api" containerID="cri-o://bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126664 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="proxy-httpd" containerID="cri-o://b8f1bc0fea0acf0b6281470f57fde1a69f28ff62f2f3296a576edd20de776e3e" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126705 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="sg-core" containerID="cri-o://fe6cf83b82351b0349144db7c6abc319736f851997900053ff240580877dd0ca" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126740 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-notification-agent" containerID="cri-o://ff51223e8a63c87de5a9a645b9f0673f798d15a5c0f79a52e0d4bdd2ecfdd810" gracePeriod=30 Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.126887 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.150337 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" podStartSLOduration=3.150320981 podStartE2EDuration="3.150320981s" podCreationTimestamp="2026-01-31 09:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:05.146210261 +0000 UTC m=+975.814893730" watchObservedRunningTime="2026-01-31 09:21:05.150320981 +0000 UTC m=+975.819004449" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.340887 4783 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.376115 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle\") pod \"ee339729-83ff-4126-8594-4ad2894aca05\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.376221 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data\") pod \"ee339729-83ff-4126-8594-4ad2894aca05\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.376276 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs\") pod \"ee339729-83ff-4126-8594-4ad2894aca05\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.376536 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brks5\" (UniqueName: \"kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5\") pod \"ee339729-83ff-4126-8594-4ad2894aca05\" (UID: \"ee339729-83ff-4126-8594-4ad2894aca05\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.377662 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs" (OuterVolumeSpecName: "logs") pod "ee339729-83ff-4126-8594-4ad2894aca05" (UID: "ee339729-83ff-4126-8594-4ad2894aca05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.383218 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5" (OuterVolumeSpecName: "kube-api-access-brks5") pod "ee339729-83ff-4126-8594-4ad2894aca05" (UID: "ee339729-83ff-4126-8594-4ad2894aca05"). InnerVolumeSpecName "kube-api-access-brks5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.403358 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee339729-83ff-4126-8594-4ad2894aca05" (UID: "ee339729-83ff-4126-8594-4ad2894aca05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.413253 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data" (OuterVolumeSpecName: "config-data") pod "ee339729-83ff-4126-8594-4ad2894aca05" (UID: "ee339729-83ff-4126-8594-4ad2894aca05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.478816 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.478849 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee339729-83ff-4126-8594-4ad2894aca05-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.478859 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee339729-83ff-4126-8594-4ad2894aca05-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.478869 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brks5\" (UniqueName: \"kubernetes.io/projected/ee339729-83ff-4126-8594-4ad2894aca05-kube-api-access-brks5\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.632309 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.682789 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") pod \"ce903588-41a7-4443-81a3-3d2db239c3a5\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.682941 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle\") pod \"ce903588-41a7-4443-81a3-3d2db239c3a5\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.682981 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7hdv\" (UniqueName: \"kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv\") pod \"ce903588-41a7-4443-81a3-3d2db239c3a5\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.691368 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv" (OuterVolumeSpecName: "kube-api-access-l7hdv") pod "ce903588-41a7-4443-81a3-3d2db239c3a5" (UID: "ce903588-41a7-4443-81a3-3d2db239c3a5"). InnerVolumeSpecName "kube-api-access-l7hdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: E0131 09:21:05.710618 4783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data podName:ce903588-41a7-4443-81a3-3d2db239c3a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:06.210547542 +0000 UTC m=+976.879231010 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data") pod "ce903588-41a7-4443-81a3-3d2db239c3a5" (UID: "ce903588-41a7-4443-81a3-3d2db239c3a5") : error deleting /var/lib/kubelet/pods/ce903588-41a7-4443-81a3-3d2db239c3a5/volume-subpaths: remove /var/lib/kubelet/pods/ce903588-41a7-4443-81a3-3d2db239c3a5/volume-subpaths: no such file or directory Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.713319 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce903588-41a7-4443-81a3-3d2db239c3a5" (UID: "ce903588-41a7-4443-81a3-3d2db239c3a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.785475 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7hdv\" (UniqueName: \"kubernetes.io/projected/ce903588-41a7-4443-81a3-3d2db239c3a5-kube-api-access-l7hdv\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:05 crc kubenswrapper[4783]: I0131 09:21:05.785604 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154697 4783 generic.go:334] "Generic (PLEG): container finished" podID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerID="b8f1bc0fea0acf0b6281470f57fde1a69f28ff62f2f3296a576edd20de776e3e" exitCode=0 Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154752 4783 generic.go:334] "Generic (PLEG): container finished" podID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerID="fe6cf83b82351b0349144db7c6abc319736f851997900053ff240580877dd0ca" exitCode=2 Jan 31 09:21:06 crc kubenswrapper[4783]: 
I0131 09:21:06.154760 4783 generic.go:334] "Generic (PLEG): container finished" podID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerID="ff51223e8a63c87de5a9a645b9f0673f798d15a5c0f79a52e0d4bdd2ecfdd810" exitCode=0 Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154768 4783 generic.go:334] "Generic (PLEG): container finished" podID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerID="489b82202105ab65bc850694eec75074736ea80da43a21a7df6bed411ac105ee" exitCode=0 Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154786 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerDied","Data":"b8f1bc0fea0acf0b6281470f57fde1a69f28ff62f2f3296a576edd20de776e3e"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154863 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerDied","Data":"fe6cf83b82351b0349144db7c6abc319736f851997900053ff240580877dd0ca"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154877 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerDied","Data":"ff51223e8a63c87de5a9a645b9f0673f798d15a5c0f79a52e0d4bdd2ecfdd810"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.154887 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerDied","Data":"489b82202105ab65bc850694eec75074736ea80da43a21a7df6bed411ac105ee"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.157614 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.157652 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce903588-41a7-4443-81a3-3d2db239c3a5","Type":"ContainerDied","Data":"aa2981056306d7e5e079b0743bc8c03db85b5ff6b068b0e428a7ebcfdd056961"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.157721 4783 scope.go:117] "RemoveContainer" containerID="20836eb91825f0846a079e37cb950df18909d02be34837774763ac2573cf1c6c" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.160512 4783 generic.go:334] "Generic (PLEG): container finished" podID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerID="ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee" exitCode=143 Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.160565 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerDied","Data":"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.163651 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.164116 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee339729-83ff-4126-8594-4ad2894aca05","Type":"ContainerDied","Data":"103cfdd0e4039f0a397367979358da96445a294837f95596de04cc415607f336"} Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.187577 4783 scope.go:117] "RemoveContainer" containerID="9095f7d2e273de689d0f47b0748af8dcaba7aca0af9c498e87cb61f1be42a4e4" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.196094 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.205204 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.210840 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.211339 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-metadata" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.211362 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-metadata" Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.211376 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-log" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.211385 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-log" Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.211449 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce903588-41a7-4443-81a3-3d2db239c3a5" 
containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.211455 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce903588-41a7-4443-81a3-3d2db239c3a5" containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.212319 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-metadata" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.212351 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee339729-83ff-4126-8594-4ad2894aca05" containerName="nova-metadata-log" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.212373 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce903588-41a7-4443-81a3-3d2db239c3a5" containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.213753 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.218214 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.218631 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.231614 4783 scope.go:117] "RemoveContainer" containerID="f8398da06cfb5ed935a264626e50ceae9fe2e4e1cc415cf55ad76b9d5d64ca04" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.231745 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.298785 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") pod \"ce903588-41a7-4443-81a3-3d2db239c3a5\" (UID: \"ce903588-41a7-4443-81a3-3d2db239c3a5\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.299351 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.299578 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.299645 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.299711 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bdf\" (UniqueName: \"kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.299952 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.303627 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data" (OuterVolumeSpecName: "config-data") pod "ce903588-41a7-4443-81a3-3d2db239c3a5" (UID: "ce903588-41a7-4443-81a3-3d2db239c3a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.346382 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.400900 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401074 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgpjk\" (UniqueName: \"kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401231 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401261 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401442 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401475 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401563 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401602 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle\") pod \"5be663b7-6190-461d-aad1-b1c285c5dffa\" (UID: \"5be663b7-6190-461d-aad1-b1c285c5dffa\") " Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401880 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.401966 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402019 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402057 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bdf\" (UniqueName: \"kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402088 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402181 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402337 4783 reconciler_common.go:293] "Volume detached for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402354 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce903588-41a7-4443-81a3-3d2db239c3a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402500 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.402807 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.405885 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts" (OuterVolumeSpecName: "scripts") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.406233 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk" (OuterVolumeSpecName: "kube-api-access-rgpjk") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "kube-api-access-rgpjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.408007 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.412964 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.417128 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.428295 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bdf\" (UniqueName: \"kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf\") pod \"nova-metadata-0\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.439383 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.448760 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.489278 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.497745 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data" (OuterVolumeSpecName: "config-data") pod "5be663b7-6190-461d-aad1-b1c285c5dffa" (UID: "5be663b7-6190-461d-aad1-b1c285c5dffa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504810 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgpjk\" (UniqueName: \"kubernetes.io/projected/5be663b7-6190-461d-aad1-b1c285c5dffa-kube-api-access-rgpjk\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504840 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504852 4783 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504863 4783 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504876 4783 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5be663b7-6190-461d-aad1-b1c285c5dffa-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504887 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.504897 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5be663b7-6190-461d-aad1-b1c285c5dffa-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.506230 4783 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.522634 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.530553 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.531112 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-central-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531134 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-central-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.531174 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="proxy-httpd" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531182 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="proxy-httpd" Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.531201 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-notification-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531209 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-notification-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: E0131 09:21:06.531225 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="sg-core" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531231 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" 
containerName="sg-core" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531467 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="proxy-httpd" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531488 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-central-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531501 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="sg-core" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.531521 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" containerName="ceilometer-notification-agent" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.532389 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.535881 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.536975 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.537305 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.537613 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.606016 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.607408 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.607534 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.607617 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.607650 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.607675 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87pc\" (UniqueName: \"kubernetes.io/projected/4655d42d-6876-4b07-bde2-d8a70c62018d-kube-api-access-w87pc\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.710311 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87pc\" (UniqueName: \"kubernetes.io/projected/4655d42d-6876-4b07-bde2-d8a70c62018d-kube-api-access-w87pc\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.710509 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.710618 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.710684 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.710716 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.715138 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.715196 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.715688 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.715892 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4655d42d-6876-4b07-bde2-d8a70c62018d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.726933 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87pc\" (UniqueName: \"kubernetes.io/projected/4655d42d-6876-4b07-bde2-d8a70c62018d-kube-api-access-w87pc\") pod \"nova-cell1-novncproxy-0\" (UID: \"4655d42d-6876-4b07-bde2-d8a70c62018d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:06 crc kubenswrapper[4783]: I0131 09:21:06.851767 4783 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.005973 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: W0131 09:21:07.012412 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031520bc_e4c4_4736_9483_aea3bfcef57e.slice/crio-18ac7f8051a12577f354cb39ef14f7364e50a13f36a46f96f3943e2a1c28ca6f WatchSource:0}: Error finding container 18ac7f8051a12577f354cb39ef14f7364e50a13f36a46f96f3943e2a1c28ca6f: Status 404 returned error can't find the container with id 18ac7f8051a12577f354cb39ef14f7364e50a13f36a46f96f3943e2a1c28ca6f Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.183472 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5be663b7-6190-461d-aad1-b1c285c5dffa","Type":"ContainerDied","Data":"6f76376471cf3eeac9a73074e4c23ebadd86b1da4ecfd1b35fc7545fd22867d7"} Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.183814 4783 scope.go:117] "RemoveContainer" containerID="b8f1bc0fea0acf0b6281470f57fde1a69f28ff62f2f3296a576edd20de776e3e" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.183496 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.186792 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerStarted","Data":"18ac7f8051a12577f354cb39ef14f7364e50a13f36a46f96f3943e2a1c28ca6f"} Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.211198 4783 scope.go:117] "RemoveContainer" containerID="fe6cf83b82351b0349144db7c6abc319736f851997900053ff240580877dd0ca" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.222814 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.230904 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.249172 4783 scope.go:117] "RemoveContainer" containerID="ff51223e8a63c87de5a9a645b9f0673f798d15a5c0f79a52e0d4bdd2ecfdd810" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.249951 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.252424 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.256341 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.256410 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.258225 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.273993 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.276381 4783 scope.go:117] "RemoveContainer" containerID="489b82202105ab65bc850694eec75074736ea80da43a21a7df6bed411ac105ee" Jan 31 09:21:07 crc kubenswrapper[4783]: W0131 09:21:07.282750 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4655d42d_6876_4b07_bde2_d8a70c62018d.slice/crio-d39c72955596deb9a1993a55b9ee330f33547d2605640decf30f899315252d50 WatchSource:0}: Error finding container d39c72955596deb9a1993a55b9ee330f33547d2605640decf30f899315252d50: Status 404 returned error can't find the container with id d39c72955596deb9a1993a55b9ee330f33547d2605640decf30f899315252d50 Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.286876 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437685 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc 
kubenswrapper[4783]: I0131 09:21:07.437749 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-config-data\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437794 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437843 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjvm\" (UniqueName: \"kubernetes.io/projected/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-kube-api-access-wrjvm\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437878 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-scripts\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437907 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-run-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.437988 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-log-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.438024 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.539821 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjvm\" (UniqueName: \"kubernetes.io/projected/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-kube-api-access-wrjvm\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.539881 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-scripts\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.539916 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-run-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.539999 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-log-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" 
Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.540034 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.540079 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.540103 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-config-data\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.540130 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.540482 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-run-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.546343 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-log-httpd\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.547046 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-config-data\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.548024 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.548051 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.548093 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.548114 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-scripts\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.558280 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjvm\" (UniqueName: \"kubernetes.io/projected/4c9fe6f4-d5e6-4f59-8803-3e889c863d6c-kube-api-access-wrjvm\") pod \"ceilometer-0\" (UID: \"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c\") " pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.586240 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.660326 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be663b7-6190-461d-aad1-b1c285c5dffa" path="/var/lib/kubelet/pods/5be663b7-6190-461d-aad1-b1c285c5dffa/volumes" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.661435 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce903588-41a7-4443-81a3-3d2db239c3a5" path="/var/lib/kubelet/pods/ce903588-41a7-4443-81a3-3d2db239c3a5/volumes" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.662622 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee339729-83ff-4126-8594-4ad2894aca05" path="/var/lib/kubelet/pods/ee339729-83ff-4126-8594-4ad2894aca05/volumes" Jan 31 09:21:07 crc kubenswrapper[4783]: I0131 09:21:07.987110 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.198398 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerStarted","Data":"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f"} Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.198695 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerStarted","Data":"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32"} Jan 31 09:21:08 crc 
kubenswrapper[4783]: I0131 09:21:08.200151 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4655d42d-6876-4b07-bde2-d8a70c62018d","Type":"ContainerStarted","Data":"e9251ca27db45bbc53dc4ff78a4bc9b3183cdebc91abb9c0453e7e6eab7be20e"} Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.200206 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4655d42d-6876-4b07-bde2-d8a70c62018d","Type":"ContainerStarted","Data":"d39c72955596deb9a1993a55b9ee330f33547d2605640decf30f899315252d50"} Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.201364 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c","Type":"ContainerStarted","Data":"c57ab5bccdfded0fcb015ef35d4e4bed183d0e09461dd0dc6e48064db24a1483"} Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.247195 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.2471701729999998 podStartE2EDuration="2.247170173s" podCreationTimestamp="2026-01-31 09:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:08.213897115 +0000 UTC m=+978.882580583" watchObservedRunningTime="2026-01-31 09:21:08.247170173 +0000 UTC m=+978.915853640" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.258847 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.258822711 podStartE2EDuration="2.258822711s" podCreationTimestamp="2026-01-31 09:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:08.238813648 +0000 UTC m=+978.907497117" watchObservedRunningTime="2026-01-31 09:21:08.258822711 
+0000 UTC m=+978.927506179" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.621072 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.771964 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv4h7\" (UniqueName: \"kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7\") pod \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.772433 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs\") pod \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.772478 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle\") pod \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.772618 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data\") pod \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\" (UID: \"95349f97-4f73-4e01-9e3c-e7c61fee8f5b\") " Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.772913 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs" (OuterVolumeSpecName: "logs") pod "95349f97-4f73-4e01-9e3c-e7c61fee8f5b" (UID: "95349f97-4f73-4e01-9e3c-e7c61fee8f5b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.773367 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.776735 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7" (OuterVolumeSpecName: "kube-api-access-nv4h7") pod "95349f97-4f73-4e01-9e3c-e7c61fee8f5b" (UID: "95349f97-4f73-4e01-9e3c-e7c61fee8f5b"). InnerVolumeSpecName "kube-api-access-nv4h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.799086 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95349f97-4f73-4e01-9e3c-e7c61fee8f5b" (UID: "95349f97-4f73-4e01-9e3c-e7c61fee8f5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.799782 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data" (OuterVolumeSpecName: "config-data") pod "95349f97-4f73-4e01-9e3c-e7c61fee8f5b" (UID: "95349f97-4f73-4e01-9e3c-e7c61fee8f5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.876040 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.876074 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:08 crc kubenswrapper[4783]: I0131 09:21:08.876084 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv4h7\" (UniqueName: \"kubernetes.io/projected/95349f97-4f73-4e01-9e3c-e7c61fee8f5b-kube-api-access-nv4h7\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.214985 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c","Type":"ContainerStarted","Data":"04b05814dca04c304e4646db05085b28f57d6d965ad0e374d888929eb560225f"} Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.217557 4783 generic.go:334] "Generic (PLEG): container finished" podID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerID="bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141" exitCode=0 Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.218908 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.222239 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerDied","Data":"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141"} Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.222319 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"95349f97-4f73-4e01-9e3c-e7c61fee8f5b","Type":"ContainerDied","Data":"fd83bcbd259213dba42fc9ff0f8faad08a42f76eb883bfcb16628dc37cc59c26"} Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.222342 4783 scope.go:117] "RemoveContainer" containerID="bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.256065 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.270466 4783 scope.go:117] "RemoveContainer" containerID="ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.275453 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.290738 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:09 crc kubenswrapper[4783]: E0131 09:21:09.291667 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-log" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.291695 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-log" Jan 31 09:21:09 crc kubenswrapper[4783]: E0131 09:21:09.291742 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" 
containerName="nova-api-api" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.291751 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-api" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.291961 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-api" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.291986 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" containerName="nova-api-log" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.293621 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.295227 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.295397 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.295659 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.305002 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.387536 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.387611 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.387961 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.388260 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.388345 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.388539 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fq2j\" (UniqueName: \"kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.421731 4783 scope.go:117] "RemoveContainer" containerID="bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141" Jan 31 09:21:09 crc kubenswrapper[4783]: E0131 09:21:09.422381 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141\": container with ID starting with bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141 not found: ID does not exist" containerID="bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.422418 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141"} err="failed to get container status \"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141\": rpc error: code = NotFound desc = could not find container \"bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141\": container with ID starting with bdcbed1ab20606fd9a478320b70ca67ce7f4c4d4096cbc2ca299a71d029c9141 not found: ID does not exist" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.422449 4783 scope.go:117] "RemoveContainer" containerID="ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee" Jan 31 09:21:09 crc kubenswrapper[4783]: E0131 09:21:09.422788 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee\": container with ID starting with ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee not found: ID does not exist" containerID="ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.422809 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee"} err="failed to get container status \"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee\": rpc error: code = NotFound desc = could not find container 
\"ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee\": container with ID starting with ef264af2a9d87194f3f2c9d96525862e3f585a640d13ba37ee8212eabb41a4ee not found: ID does not exist" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.490598 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.490668 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.490774 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fq2j\" (UniqueName: \"kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.490824 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.490846 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc 
kubenswrapper[4783]: I0131 09:21:09.491326 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.492244 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.497728 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.497755 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.497901 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.498854 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data\") pod \"nova-api-0\" (UID: 
\"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.507029 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fq2j\" (UniqueName: \"kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j\") pod \"nova-api-0\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " pod="openstack/nova-api-0" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.657964 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95349f97-4f73-4e01-9e3c-e7c61fee8f5b" path="/var/lib/kubelet/pods/95349f97-4f73-4e01-9e3c-e7c61fee8f5b/volumes" Jan 31 09:21:09 crc kubenswrapper[4783]: I0131 09:21:09.728133 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:10 crc kubenswrapper[4783]: I0131 09:21:10.121851 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:10 crc kubenswrapper[4783]: W0131 09:21:10.131114 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2ce0fab_e057_4da3_90c7_58a3841d1fe4.slice/crio-c56efa8c98a5808269d6ef6dd6b4c2c58045d0a951f2cb9585f2d648e6688bb6 WatchSource:0}: Error finding container c56efa8c98a5808269d6ef6dd6b4c2c58045d0a951f2cb9585f2d648e6688bb6: Status 404 returned error can't find the container with id c56efa8c98a5808269d6ef6dd6b4c2c58045d0a951f2cb9585f2d648e6688bb6 Jan 31 09:21:10 crc kubenswrapper[4783]: I0131 09:21:10.235560 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerStarted","Data":"c56efa8c98a5808269d6ef6dd6b4c2c58045d0a951f2cb9585f2d648e6688bb6"} Jan 31 09:21:10 crc kubenswrapper[4783]: I0131 09:21:10.238659 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c","Type":"ContainerStarted","Data":"19a8985006fcfd810bf68b1386892a58cdbe1b6b0dd0353a5882cb4e81f3f0b4"} Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.249354 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerStarted","Data":"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff"} Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.249675 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerStarted","Data":"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666"} Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.251401 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c","Type":"ContainerStarted","Data":"ac1b44508a3db65f1c81fa73505a599da3efc2469039979b7b8a3e50c7c0d489"} Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.269063 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.269049851 podStartE2EDuration="2.269049851s" podCreationTimestamp="2026-01-31 09:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:11.264269809 +0000 UTC m=+981.932953278" watchObservedRunningTime="2026-01-31 09:21:11.269049851 +0000 UTC m=+981.937733318" Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.606804 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.607078 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:21:11 crc kubenswrapper[4783]: I0131 09:21:11.853256 4783 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:12 crc kubenswrapper[4783]: I0131 09:21:12.533297 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:21:12 crc kubenswrapper[4783]: I0131 09:21:12.609642 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:21:12 crc kubenswrapper[4783]: I0131 09:21:12.609920 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="dnsmasq-dns" containerID="cri-o://f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae" gracePeriod=10 Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.062954 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.157520 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb\") pod \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.157568 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config\") pod \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.157591 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhr7x\" (UniqueName: \"kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x\") pod 
\"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.157655 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0\") pod \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.157709 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb\") pod \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.166309 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x" (OuterVolumeSpecName: "kube-api-access-hhr7x") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "kube-api-access-hhr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.197774 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config" (OuterVolumeSpecName: "config") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.198693 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.199780 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.203602 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.259356 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc\") pod \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\" (UID: \"0d965f76-ea10-4246-aeb2-014ba9f3fd65\") " Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.260732 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.260756 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhr7x\" (UniqueName: \"kubernetes.io/projected/0d965f76-ea10-4246-aeb2-014ba9f3fd65-kube-api-access-hhr7x\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.260769 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.260778 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.260791 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.275665 4783 generic.go:334] "Generic (PLEG): container finished" podID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerID="f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae" exitCode=0 
Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.275738 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" event={"ID":"0d965f76-ea10-4246-aeb2-014ba9f3fd65","Type":"ContainerDied","Data":"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae"} Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.275772 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" event={"ID":"0d965f76-ea10-4246-aeb2-014ba9f3fd65","Type":"ContainerDied","Data":"f9aaa7f823ec5a17abfa5a323cdf131d1fdbbe3fae48cf110ffdb6f2d6ac7aac"} Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.275794 4783 scope.go:117] "RemoveContainer" containerID="f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.275815 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9ljpj" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.283229 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c9fe6f4-d5e6-4f59-8803-3e889c863d6c","Type":"ContainerStarted","Data":"3725a7a114358ea554746ee53141dadd9f7d32a75d4d9dffacf89aec707a3d86"} Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.283463 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.314883 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d965f76-ea10-4246-aeb2-014ba9f3fd65" (UID: "0d965f76-ea10-4246-aeb2-014ba9f3fd65"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.344960 4783 scope.go:117] "RemoveContainer" containerID="91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.365246 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d965f76-ea10-4246-aeb2-014ba9f3fd65-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.365888 4783 scope.go:117] "RemoveContainer" containerID="f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae" Jan 31 09:21:13 crc kubenswrapper[4783]: E0131 09:21:13.366221 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae\": container with ID starting with f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae not found: ID does not exist" containerID="f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.366261 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae"} err="failed to get container status \"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae\": rpc error: code = NotFound desc = could not find container \"f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae\": container with ID starting with f88efe5e65c571ca614a8a301b5858cc10d34427e68154841a2b7dbc3eca84ae not found: ID does not exist" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.366292 4783 scope.go:117] "RemoveContainer" containerID="91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc" Jan 31 09:21:13 crc kubenswrapper[4783]: E0131 09:21:13.366712 4783 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc\": container with ID starting with 91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc not found: ID does not exist" containerID="91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.366742 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc"} err="failed to get container status \"91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc\": rpc error: code = NotFound desc = could not find container \"91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc\": container with ID starting with 91e93a6173cd47706391b6eca51d68985b99374cf8bf44c9e672c15487954ccc not found: ID does not exist" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.607610 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.529169068 podStartE2EDuration="6.607579862s" podCreationTimestamp="2026-01-31 09:21:07 +0000 UTC" firstStartedPulling="2026-01-31 09:21:07.990911608 +0000 UTC m=+978.659595077" lastFinishedPulling="2026-01-31 09:21:12.069322403 +0000 UTC m=+982.738005871" observedRunningTime="2026-01-31 09:21:13.304462337 +0000 UTC m=+983.973145805" watchObservedRunningTime="2026-01-31 09:21:13.607579862 +0000 UTC m=+984.276263329" Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.610208 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.617839 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9ljpj"] Jan 31 09:21:13 crc kubenswrapper[4783]: I0131 09:21:13.655583 4783 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" path="/var/lib/kubelet/pods/0d965f76-ea10-4246-aeb2-014ba9f3fd65/volumes" Jan 31 09:21:16 crc kubenswrapper[4783]: I0131 09:21:16.606873 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:21:16 crc kubenswrapper[4783]: I0131 09:21:16.607843 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:21:16 crc kubenswrapper[4783]: I0131 09:21:16.853186 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:16 crc kubenswrapper[4783]: I0131 09:21:16.877105 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.345616 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.541417 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-h5qh5"] Jan 31 09:21:17 crc kubenswrapper[4783]: E0131 09:21:17.544066 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="init" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.544097 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="init" Jan 31 09:21:17 crc kubenswrapper[4783]: E0131 09:21:17.544141 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="dnsmasq-dns" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.544150 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="dnsmasq-dns" Jan 31 09:21:17 
crc kubenswrapper[4783]: I0131 09:21:17.545433 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d965f76-ea10-4246-aeb2-014ba9f3fd65" containerName="dnsmasq-dns" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.546344 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.549421 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.551595 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.566310 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h5qh5"] Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.627538 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.627611 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.660409 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " 
pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.660485 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.660517 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf99c\" (UniqueName: \"kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.661368 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.763035 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.764068 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " 
pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.764528 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.764573 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf99c\" (UniqueName: \"kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.771476 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.772731 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.774869 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 
09:21:17.783837 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf99c\" (UniqueName: \"kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c\") pod \"nova-cell1-cell-mapping-h5qh5\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:17 crc kubenswrapper[4783]: I0131 09:21:17.869519 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:18 crc kubenswrapper[4783]: I0131 09:21:18.280490 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h5qh5"] Jan 31 09:21:18 crc kubenswrapper[4783]: I0131 09:21:18.338728 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h5qh5" event={"ID":"813d6a18-c983-4731-ae74-ca671d822949","Type":"ContainerStarted","Data":"27bf56bdc748ce4e1cc35fc54d0de65f5b82b5985db113fcd43a2b23d733b9b7"} Jan 31 09:21:19 crc kubenswrapper[4783]: I0131 09:21:19.350928 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h5qh5" event={"ID":"813d6a18-c983-4731-ae74-ca671d822949","Type":"ContainerStarted","Data":"b2fa835bac2c2f9da8ceca08b17a1aa1557c3d284a9b06ac7a38e39b29237280"} Jan 31 09:21:19 crc kubenswrapper[4783]: I0131 09:21:19.378033 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-h5qh5" podStartSLOduration=2.378016713 podStartE2EDuration="2.378016713s" podCreationTimestamp="2026-01-31 09:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:19.370757798 +0000 UTC m=+990.039441266" watchObservedRunningTime="2026-01-31 09:21:19.378016713 +0000 UTC m=+990.046700171" Jan 31 09:21:19 crc kubenswrapper[4783]: I0131 09:21:19.729131 4783 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:21:19 crc kubenswrapper[4783]: I0131 09:21:19.729482 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:21:20 crc kubenswrapper[4783]: I0131 09:21:20.776336 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:20 crc kubenswrapper[4783]: I0131 09:21:20.777000 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:22 crc kubenswrapper[4783]: I0131 09:21:22.393407 4783 generic.go:334] "Generic (PLEG): container finished" podID="813d6a18-c983-4731-ae74-ca671d822949" containerID="b2fa835bac2c2f9da8ceca08b17a1aa1557c3d284a9b06ac7a38e39b29237280" exitCode=0 Jan 31 09:21:22 crc kubenswrapper[4783]: I0131 09:21:22.393502 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h5qh5" event={"ID":"813d6a18-c983-4731-ae74-ca671d822949","Type":"ContainerDied","Data":"b2fa835bac2c2f9da8ceca08b17a1aa1557c3d284a9b06ac7a38e39b29237280"} Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.704702 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.895432 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts\") pod \"813d6a18-c983-4731-ae74-ca671d822949\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.895597 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data\") pod \"813d6a18-c983-4731-ae74-ca671d822949\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.895683 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle\") pod \"813d6a18-c983-4731-ae74-ca671d822949\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.895843 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf99c\" (UniqueName: \"kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c\") pod \"813d6a18-c983-4731-ae74-ca671d822949\" (UID: \"813d6a18-c983-4731-ae74-ca671d822949\") " Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.902966 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c" (OuterVolumeSpecName: "kube-api-access-jf99c") pod "813d6a18-c983-4731-ae74-ca671d822949" (UID: "813d6a18-c983-4731-ae74-ca671d822949"). InnerVolumeSpecName "kube-api-access-jf99c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.903294 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts" (OuterVolumeSpecName: "scripts") pod "813d6a18-c983-4731-ae74-ca671d822949" (UID: "813d6a18-c983-4731-ae74-ca671d822949"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.921933 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data" (OuterVolumeSpecName: "config-data") pod "813d6a18-c983-4731-ae74-ca671d822949" (UID: "813d6a18-c983-4731-ae74-ca671d822949"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.922594 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "813d6a18-c983-4731-ae74-ca671d822949" (UID: "813d6a18-c983-4731-ae74-ca671d822949"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.998724 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf99c\" (UniqueName: \"kubernetes.io/projected/813d6a18-c983-4731-ae74-ca671d822949-kube-api-access-jf99c\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.998754 4783 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.998764 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:23 crc kubenswrapper[4783]: I0131 09:21:23.998776 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813d6a18-c983-4731-ae74-ca671d822949-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.414823 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h5qh5" event={"ID":"813d6a18-c983-4731-ae74-ca671d822949","Type":"ContainerDied","Data":"27bf56bdc748ce4e1cc35fc54d0de65f5b82b5985db113fcd43a2b23d733b9b7"} Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.414868 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27bf56bdc748ce4e1cc35fc54d0de65f5b82b5985db113fcd43a2b23d733b9b7" Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.414873 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h5qh5" Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.487973 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.488351 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-api" containerID="cri-o://78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff" gracePeriod=30 Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.488360 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-log" containerID="cri-o://7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666" gracePeriod=30 Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.494984 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.495179 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" containerName="nova-scheduler-scheduler" containerID="cri-o://6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91" gracePeriod=30 Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.534500 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.534740 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-log" containerID="cri-o://f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f" gracePeriod=30 Jan 31 09:21:24 crc kubenswrapper[4783]: I0131 09:21:24.534816 4783 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-metadata" containerID="cri-o://84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32" gracePeriod=30 Jan 31 09:21:25 crc kubenswrapper[4783]: I0131 09:21:25.427145 4783 generic.go:334] "Generic (PLEG): container finished" podID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerID="f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f" exitCode=143 Jan 31 09:21:25 crc kubenswrapper[4783]: I0131 09:21:25.427196 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerDied","Data":"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f"} Jan 31 09:21:25 crc kubenswrapper[4783]: I0131 09:21:25.431976 4783 generic.go:334] "Generic (PLEG): container finished" podID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerID="7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666" exitCode=143 Jan 31 09:21:25 crc kubenswrapper[4783]: I0131 09:21:25.432033 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerDied","Data":"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666"} Jan 31 09:21:26 crc kubenswrapper[4783]: I0131 09:21:26.936650 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:21:26 crc kubenswrapper[4783]: I0131 09:21:26.971306 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data\") pod \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " Jan 31 09:21:26 crc kubenswrapper[4783]: I0131 09:21:26.971507 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle\") pod \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " Jan 31 09:21:26 crc kubenswrapper[4783]: I0131 09:21:26.971561 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tszvw\" (UniqueName: \"kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw\") pod \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\" (UID: \"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2\") " Jan 31 09:21:26 crc kubenswrapper[4783]: I0131 09:21:26.978139 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw" (OuterVolumeSpecName: "kube-api-access-tszvw") pod "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" (UID: "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2"). InnerVolumeSpecName "kube-api-access-tszvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.010908 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" (UID: "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.014916 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data" (OuterVolumeSpecName: "config-data") pod "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" (UID: "ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.073147 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.073213 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.073227 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tszvw\" (UniqueName: \"kubernetes.io/projected/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2-kube-api-access-tszvw\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.494202 4783 generic.go:334] "Generic (PLEG): container finished" podID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" containerID="6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91" exitCode=0 Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.494246 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2","Type":"ContainerDied","Data":"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91"} Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.494270 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2","Type":"ContainerDied","Data":"21344fbbd395ed610598b0db46ddbdb41b125a37903c64a83b90e25635ab7076"} Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.494286 4783 scope.go:117] "RemoveContainer" containerID="6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.494402 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.526488 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.535628 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.542100 4783 scope.go:117] "RemoveContainer" containerID="6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91" Jan 31 09:21:27 crc kubenswrapper[4783]: E0131 09:21:27.542603 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91\": container with ID starting with 6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91 not found: ID does not exist" containerID="6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.542651 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91"} err="failed to get container status \"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91\": rpc error: code = NotFound desc = could not find container \"6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91\": container with ID starting with 
6736ca4a45576d94914954709afc54430d98d339059bd0381c2ab6a871da8c91 not found: ID does not exist" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.547214 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:27 crc kubenswrapper[4783]: E0131 09:21:27.547721 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813d6a18-c983-4731-ae74-ca671d822949" containerName="nova-manage" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.547734 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="813d6a18-c983-4731-ae74-ca671d822949" containerName="nova-manage" Jan 31 09:21:27 crc kubenswrapper[4783]: E0131 09:21:27.547768 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" containerName="nova-scheduler-scheduler" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.547776 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" containerName="nova-scheduler-scheduler" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.547960 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="813d6a18-c983-4731-ae74-ca671d822949" containerName="nova-manage" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.547985 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" containerName="nova-scheduler-scheduler" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.548662 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.554852 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.560419 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.585444 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-config-data\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.585513 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.585641 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkv7k\" (UniqueName: \"kubernetes.io/projected/43315408-26b4-4864-af4e-e3cbad195816-kube-api-access-qkv7k\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.665038 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2" path="/var/lib/kubelet/pods/ad6060e5-96fb-4589-9bc7-2c7fd8fca6b2/volumes" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.687082 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkv7k\" (UniqueName: 
\"kubernetes.io/projected/43315408-26b4-4864-af4e-e3cbad195816-kube-api-access-qkv7k\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.687273 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-config-data\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.687308 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.691740 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.691894 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43315408-26b4-4864-af4e-e3cbad195816-config-data\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.701965 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkv7k\" (UniqueName: \"kubernetes.io/projected/43315408-26b4-4864-af4e-e3cbad195816-kube-api-access-qkv7k\") pod \"nova-scheduler-0\" (UID: \"43315408-26b4-4864-af4e-e3cbad195816\") " 
pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.875009 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.970810 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991468 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991503 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991531 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991559 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991606 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.991647 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fq2j\" (UniqueName: \"kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j\") pod \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\" (UID: \"d2ce0fab-e057-4da3-90c7-58a3841d1fe4\") " Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.996362 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs" (OuterVolumeSpecName: "logs") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:27 crc kubenswrapper[4783]: I0131 09:21:27.999006 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j" (OuterVolumeSpecName: "kube-api-access-6fq2j") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "kube-api-access-6fq2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.019444 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data" (OuterVolumeSpecName: "config-data") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.024201 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.040538 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.045395 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.053662 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d2ce0fab-e057-4da3-90c7-58a3841d1fe4" (UID: "d2ce0fab-e057-4da3-90c7-58a3841d1fe4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097145 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle\") pod \"031520bc-e4c4-4736-9483-aea3bfcef57e\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097273 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs\") pod \"031520bc-e4c4-4736-9483-aea3bfcef57e\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097356 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bdf\" (UniqueName: \"kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf\") pod \"031520bc-e4c4-4736-9483-aea3bfcef57e\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097485 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs\") pod \"031520bc-e4c4-4736-9483-aea3bfcef57e\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097509 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data\") pod \"031520bc-e4c4-4736-9483-aea3bfcef57e\" (UID: \"031520bc-e4c4-4736-9483-aea3bfcef57e\") " Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.097914 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs" (OuterVolumeSpecName: "logs") pod "031520bc-e4c4-4736-9483-aea3bfcef57e" (UID: "031520bc-e4c4-4736-9483-aea3bfcef57e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098400 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/031520bc-e4c4-4736-9483-aea3bfcef57e-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098452 4783 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098469 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fq2j\" (UniqueName: \"kubernetes.io/projected/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-kube-api-access-6fq2j\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098505 4783 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098517 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098528 4783 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.098540 4783 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d2ce0fab-e057-4da3-90c7-58a3841d1fe4-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.101414 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf" (OuterVolumeSpecName: "kube-api-access-h7bdf") pod "031520bc-e4c4-4736-9483-aea3bfcef57e" (UID: "031520bc-e4c4-4736-9483-aea3bfcef57e"). InnerVolumeSpecName "kube-api-access-h7bdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.119214 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "031520bc-e4c4-4736-9483-aea3bfcef57e" (UID: "031520bc-e4c4-4736-9483-aea3bfcef57e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.120346 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data" (OuterVolumeSpecName: "config-data") pod "031520bc-e4c4-4736-9483-aea3bfcef57e" (UID: "031520bc-e4c4-4736-9483-aea3bfcef57e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.140542 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "031520bc-e4c4-4736-9483-aea3bfcef57e" (UID: "031520bc-e4c4-4736-9483-aea3bfcef57e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.200733 4783 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.200771 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.200785 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031520bc-e4c4-4736-9483-aea3bfcef57e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.200800 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bdf\" (UniqueName: \"kubernetes.io/projected/031520bc-e4c4-4736-9483-aea3bfcef57e-kube-api-access-h7bdf\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:28 crc kubenswrapper[4783]: W0131 09:21:28.294181 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43315408_26b4_4864_af4e_e3cbad195816.slice/crio-bb51da1d2c9945f250d2953344847b079862c91d1dfe36da582812be8c2baf32 WatchSource:0}: Error finding container bb51da1d2c9945f250d2953344847b079862c91d1dfe36da582812be8c2baf32: Status 404 returned error can't find the container with id bb51da1d2c9945f250d2953344847b079862c91d1dfe36da582812be8c2baf32 Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.294674 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.508634 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"43315408-26b4-4864-af4e-e3cbad195816","Type":"ContainerStarted","Data":"e14d0f65552c8cb4f0c50e5222b2de1dbfb0efe0c8d368fa2f5324919ec4ff01"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.508896 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43315408-26b4-4864-af4e-e3cbad195816","Type":"ContainerStarted","Data":"bb51da1d2c9945f250d2953344847b079862c91d1dfe36da582812be8c2baf32"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.512945 4783 generic.go:334] "Generic (PLEG): container finished" podID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerID="84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32" exitCode=0 Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.512995 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerDied","Data":"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.513014 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"031520bc-e4c4-4736-9483-aea3bfcef57e","Type":"ContainerDied","Data":"18ac7f8051a12577f354cb39ef14f7364e50a13f36a46f96f3943e2a1c28ca6f"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.513034 4783 scope.go:117] "RemoveContainer" containerID="84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.513128 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.516207 4783 generic.go:334] "Generic (PLEG): container finished" podID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerID="78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff" exitCode=0 Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.516293 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerDied","Data":"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.516338 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ce0fab-e057-4da3-90c7-58a3841d1fe4","Type":"ContainerDied","Data":"c56efa8c98a5808269d6ef6dd6b4c2c58045d0a951f2cb9585f2d648e6688bb6"} Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.516427 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.529994 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.52998202 podStartE2EDuration="1.52998202s" podCreationTimestamp="2026-01-31 09:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:28.525296667 +0000 UTC m=+999.193980135" watchObservedRunningTime="2026-01-31 09:21:28.52998202 +0000 UTC m=+999.198665488" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.549472 4783 scope.go:117] "RemoveContainer" containerID="f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.554274 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.565248 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.567821 4783 scope.go:117] "RemoveContainer" containerID="84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.568211 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32\": container with ID starting with 84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32 not found: ID does not exist" containerID="84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.568259 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32"} err="failed to get container status 
\"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32\": rpc error: code = NotFound desc = could not find container \"84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32\": container with ID starting with 84ae8b3ecd2c55cb136323564ee9dc4ea094da3adf561df449e9e567b11bcf32 not found: ID does not exist" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.568279 4783 scope.go:117] "RemoveContainer" containerID="f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.568487 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f\": container with ID starting with f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f not found: ID does not exist" containerID="f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.568510 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f"} err="failed to get container status \"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f\": rpc error: code = NotFound desc = could not find container \"f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f\": container with ID starting with f63c4c22d031cad4a7e7180fa6aa247d18773fa10f962cb472d560dfbcb5c32f not found: ID does not exist" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.568523 4783 scope.go:117] "RemoveContainer" containerID="78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.572050 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.583285 4783 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.583812 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-api" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.583832 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-api" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.583862 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-log" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.583869 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-log" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.583891 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-metadata" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.583897 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-metadata" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.583909 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-log" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.583916 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-log" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.584100 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-metadata" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.584114 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-api" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.584132 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" containerName="nova-metadata-log" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.584149 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" containerName="nova-api-log" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.585193 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.587113 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.587306 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.593415 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.594108 4783 scope.go:117] "RemoveContainer" containerID="7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.599129 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.600750 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.605185 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.607917 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011e6c87-b549-4268-b64e-b5d49c9e7cd8-logs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.607979 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-config-data\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.608072 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp299\" (UniqueName: \"kubernetes.io/projected/011e6c87-b549-4268-b64e-b5d49c9e7cd8-kube-api-access-dp299\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.608121 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.608228 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.609357 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.612820 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.613013 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.613173 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.676754 4783 scope.go:117] "RemoveContainer" containerID="78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.677367 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff\": container with ID starting with 78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff not found: ID does not exist" containerID="78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.677417 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff"} err="failed to get container status \"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff\": rpc error: code = NotFound desc = could not find container \"78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff\": container with ID starting with 
78d97ece5312096faf53a734a6e9149a69cb584b46a8cc7b565e3efa41fe2fff not found: ID does not exist" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.677453 4783 scope.go:117] "RemoveContainer" containerID="7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666" Jan 31 09:21:28 crc kubenswrapper[4783]: E0131 09:21:28.678716 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666\": container with ID starting with 7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666 not found: ID does not exist" containerID="7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.678768 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666"} err="failed to get container status \"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666\": rpc error: code = NotFound desc = could not find container \"7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666\": container with ID starting with 7a579705aefbddc0a01ed1496084f3cb8ba66b87251734ee0d5c7b7c8d230666 not found: ID does not exist" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710351 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-config-data\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710414 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp299\" (UniqueName: \"kubernetes.io/projected/011e6c87-b549-4268-b64e-b5d49c9e7cd8-kube-api-access-dp299\") pod \"nova-metadata-0\" (UID: 
\"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710449 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710494 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710541 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710583 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710609 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 
09:21:28.710709 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-logs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710827 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011e6c87-b549-4268-b64e-b5d49c9e7cd8-logs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.710939 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-config-data\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.711113 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7nhn\" (UniqueName: \"kubernetes.io/projected/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-kube-api-access-c7nhn\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.711358 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011e6c87-b549-4268-b64e-b5d49c9e7cd8-logs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.715155 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.715245 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.715817 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011e6c87-b549-4268-b64e-b5d49c9e7cd8-config-data\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.724803 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp299\" (UniqueName: \"kubernetes.io/projected/011e6c87-b549-4268-b64e-b5d49c9e7cd8-kube-api-access-dp299\") pod \"nova-metadata-0\" (UID: \"011e6c87-b549-4268-b64e-b5d49c9e7cd8\") " pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.812419 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-config-data\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.813444 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.813599 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.813726 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.813815 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-logs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.814063 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7nhn\" (UniqueName: \"kubernetes.io/projected/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-kube-api-access-c7nhn\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.814373 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-logs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.818250 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-public-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 
09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.818544 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.818559 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.818711 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-config-data\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.829196 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7nhn\" (UniqueName: \"kubernetes.io/projected/b4163808-2c2d-4fdd-a2b3-84b36dfa4112-kube-api-access-c7nhn\") pod \"nova-api-0\" (UID: \"b4163808-2c2d-4fdd-a2b3-84b36dfa4112\") " pod="openstack/nova-api-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.962538 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:21:28 crc kubenswrapper[4783]: I0131 09:21:28.972934 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.395997 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:21:29 crc kubenswrapper[4783]: W0131 09:21:29.441509 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4163808_2c2d_4fdd_a2b3_84b36dfa4112.slice/crio-c9e54c739bca948bc928acfb858e19c2115f2a481845b640ae56815d0bde6f09 WatchSource:0}: Error finding container c9e54c739bca948bc928acfb858e19c2115f2a481845b640ae56815d0bde6f09: Status 404 returned error can't find the container with id c9e54c739bca948bc928acfb858e19c2115f2a481845b640ae56815d0bde6f09 Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.444133 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.534289 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"011e6c87-b549-4268-b64e-b5d49c9e7cd8","Type":"ContainerStarted","Data":"77ea4e3c57a6c6f7c3a9db914f0ccfd2c1c785019727f27aa3fa65d98f47af2c"} Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.534629 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"011e6c87-b549-4268-b64e-b5d49c9e7cd8","Type":"ContainerStarted","Data":"c12b7d72ba779ca5355fa5b8951072af2eb620da8cce2b23875374f5c505d289"} Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.539839 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4163808-2c2d-4fdd-a2b3-84b36dfa4112","Type":"ContainerStarted","Data":"c9e54c739bca948bc928acfb858e19c2115f2a481845b640ae56815d0bde6f09"} Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.669803 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031520bc-e4c4-4736-9483-aea3bfcef57e" 
path="/var/lib/kubelet/pods/031520bc-e4c4-4736-9483-aea3bfcef57e/volumes" Jan 31 09:21:29 crc kubenswrapper[4783]: I0131 09:21:29.670440 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ce0fab-e057-4da3-90c7-58a3841d1fe4" path="/var/lib/kubelet/pods/d2ce0fab-e057-4da3-90c7-58a3841d1fe4/volumes" Jan 31 09:21:30 crc kubenswrapper[4783]: I0131 09:21:30.549376 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"011e6c87-b549-4268-b64e-b5d49c9e7cd8","Type":"ContainerStarted","Data":"9aeac76369c2482675f3a0abd08311c59ee784c6485752789d4321c5b5f966fd"} Jan 31 09:21:30 crc kubenswrapper[4783]: I0131 09:21:30.552527 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4163808-2c2d-4fdd-a2b3-84b36dfa4112","Type":"ContainerStarted","Data":"56fe73e90eff1b634ead17b4c55fca353c0526025e3213e9bc9370dd28b85da6"} Jan 31 09:21:30 crc kubenswrapper[4783]: I0131 09:21:30.552646 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4163808-2c2d-4fdd-a2b3-84b36dfa4112","Type":"ContainerStarted","Data":"07bb434d0106d02a9a9e5e7f80e0ae5a3ba9fc8f60e719d3d0a405c9c157089d"} Jan 31 09:21:30 crc kubenswrapper[4783]: I0131 09:21:30.571541 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.571481703 podStartE2EDuration="2.571481703s" podCreationTimestamp="2026-01-31 09:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:30.565143053 +0000 UTC m=+1001.233826520" watchObservedRunningTime="2026-01-31 09:21:30.571481703 +0000 UTC m=+1001.240165171" Jan 31 09:21:30 crc kubenswrapper[4783]: I0131 09:21:30.585674 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5856581480000003 
podStartE2EDuration="2.585658148s" podCreationTimestamp="2026-01-31 09:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:30.582010902 +0000 UTC m=+1001.250694360" watchObservedRunningTime="2026-01-31 09:21:30.585658148 +0000 UTC m=+1001.254341617" Jan 31 09:21:32 crc kubenswrapper[4783]: I0131 09:21:32.875573 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:21:33 crc kubenswrapper[4783]: I0131 09:21:33.963272 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:21:33 crc kubenswrapper[4783]: I0131 09:21:33.963348 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:21:37 crc kubenswrapper[4783]: I0131 09:21:37.598969 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:21:37 crc kubenswrapper[4783]: I0131 09:21:37.875418 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 31 09:21:37 crc kubenswrapper[4783]: I0131 09:21:37.902900 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 31 09:21:38 crc kubenswrapper[4783]: I0131 09:21:38.647525 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 31 09:21:38 crc kubenswrapper[4783]: I0131 09:21:38.963311 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:21:38 crc kubenswrapper[4783]: I0131 09:21:38.963716 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:21:38 crc kubenswrapper[4783]: I0131 09:21:38.974355 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:21:38 crc kubenswrapper[4783]: I0131 09:21:38.975063 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:21:39 crc kubenswrapper[4783]: I0131 09:21:39.982353 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="011e6c87-b549-4268-b64e-b5d49c9e7cd8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:39 crc kubenswrapper[4783]: I0131 09:21:39.982364 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="011e6c87-b549-4268-b64e-b5d49c9e7cd8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:39 crc kubenswrapper[4783]: I0131 09:21:39.993266 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b4163808-2c2d-4fdd-a2b3-84b36dfa4112" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:39 crc kubenswrapper[4783]: I0131 09:21:39.993304 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b4163808-2c2d-4fdd-a2b3-84b36dfa4112" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.968656 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.971539 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.975681 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.979271 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.979680 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.980086 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:21:48 crc kubenswrapper[4783]: I0131 09:21:48.988533 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:21:49 crc kubenswrapper[4783]: I0131 09:21:49.722426 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:21:49 crc kubenswrapper[4783]: I0131 09:21:49.728509 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:21:49 crc kubenswrapper[4783]: I0131 09:21:49.728638 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:21:56 crc kubenswrapper[4783]: I0131 09:21:56.509921 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:21:57 crc kubenswrapper[4783]: I0131 09:21:57.246134 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:00 crc kubenswrapper[4783]: I0131 09:22:00.592919 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="rabbitmq" containerID="cri-o://aed3b74b7093db8a32cf49978aa4e68d9e7e71074ce9ae62f92106c4466f7f8c" 
gracePeriod=604796 Jan 31 09:22:01 crc kubenswrapper[4783]: I0131 09:22:01.039524 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="rabbitmq" containerID="cri-o://6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160" gracePeriod=604797 Jan 31 09:22:03 crc kubenswrapper[4783]: I0131 09:22:03.777133 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Jan 31 09:22:04 crc kubenswrapper[4783]: I0131 09:22:04.046777 4783 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Jan 31 09:22:06 crc kubenswrapper[4783]: I0131 09:22:06.879621 4783 generic.go:334] "Generic (PLEG): container finished" podID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerID="aed3b74b7093db8a32cf49978aa4e68d9e7e71074ce9ae62f92106c4466f7f8c" exitCode=0 Jan 31 09:22:06 crc kubenswrapper[4783]: I0131 09:22:06.879776 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerDied","Data":"aed3b74b7093db8a32cf49978aa4e68d9e7e71074ce9ae62f92106c4466f7f8c"} Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.320028 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418306 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418386 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418439 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418476 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418551 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418596 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vnp8h\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418636 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418695 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418737 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418766 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.418829 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data\") pod \"e44f3996-11b5-4095-a1f3-e1bc24974386\" (UID: \"e44f3996-11b5-4095-a1f3-e1bc24974386\") " Jan 31 09:22:07 crc kubenswrapper[4783]: 
I0131 09:22:07.423528 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.426965 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info" (OuterVolumeSpecName: "pod-info") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.427325 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.427419 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.427814 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.432540 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.438649 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h" (OuterVolumeSpecName: "kube-api-access-vnp8h") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "kube-api-access-vnp8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.453499 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data" (OuterVolumeSpecName: "config-data") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.455370 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.480605 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf" (OuterVolumeSpecName: "server-conf") pod "e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522773 4783 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e44f3996-11b5-4095-a1f3-e1bc24974386-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522807 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522818 4783 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e44f3996-11b5-4095-a1f3-e1bc24974386-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522829 4783 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 
09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522838 4783 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-server-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522846 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnp8h\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-kube-api-access-vnp8h\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522858 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522892 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522901 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.522908 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e44f3996-11b5-4095-a1f3-e1bc24974386-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.568817 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:07 crc kubenswrapper[4783]: E0131 09:22:07.569217 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="setup-container" Jan 31 09:22:07 crc kubenswrapper[4783]: 
I0131 09:22:07.569230 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="setup-container" Jan 31 09:22:07 crc kubenswrapper[4783]: E0131 09:22:07.569249 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="rabbitmq" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.569254 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="rabbitmq" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.569421 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" containerName="rabbitmq" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.586341 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.592537 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.592922 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.593739 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.625617 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.631780 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod 
"e44f3996-11b5-4095-a1f3-e1bc24974386" (UID: "e44f3996-11b5-4095-a1f3-e1bc24974386"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.674785 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.728966 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729012 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729041 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729064 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc 
kubenswrapper[4783]: I0131 09:22:07.729189 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smblk\" (UniqueName: \"kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729210 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729242 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.729298 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e44f3996-11b5-4095-a1f3-e1bc24974386-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.830827 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.830871 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.830919 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.830936 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.830998 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831052 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831075 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc 
kubenswrapper[4783]: I0131 09:22:07.831094 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kprh4\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831116 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831205 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831258 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls\") pod \"1aa1eeb1-d389-4933-a40b-3383b28597c2\" (UID: \"1aa1eeb1-d389-4933-a40b-3383b28597c2\") " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831667 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smblk\" (UniqueName: \"kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831701 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831750 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831799 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831816 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831838 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831858 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.831855 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.832013 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.832798 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.834512 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.834929 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.835472 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.836003 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.838466 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.838517 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.838886 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.840292 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.852282 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4" (OuterVolumeSpecName: "kube-api-access-kprh4") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "kube-api-access-kprh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.852313 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.862933 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smblk\" (UniqueName: \"kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk\") pod \"dnsmasq-dns-668b55cdd7-5pdvt\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.876347 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info" (OuterVolumeSpecName: "pod-info") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.892510 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e44f3996-11b5-4095-a1f3-e1bc24974386","Type":"ContainerDied","Data":"4610be82240c0e59547139cbf9cadcc6a98c5ddac8c30e568d2c1057efb0bfe1"} Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.892574 4783 scope.go:117] "RemoveContainer" containerID="aed3b74b7093db8a32cf49978aa4e68d9e7e71074ce9ae62f92106c4466f7f8c" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.892733 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.893136 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf" (OuterVolumeSpecName: "server-conf") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.895633 4783 generic.go:334] "Generic (PLEG): container finished" podID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerID="6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160" exitCode=0 Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.895933 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.897100 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerDied","Data":"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160"} Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.897119 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1aa1eeb1-d389-4933-a40b-3383b28597c2","Type":"ContainerDied","Data":"31f705b1312ba3a9b8bcaff19857e3818870b150542ad99f835a9eca30384dbc"} Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.911864 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data" (OuterVolumeSpecName: "config-data") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.918454 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.933952 4783 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1aa1eeb1-d389-4933-a40b-3383b28597c2-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.933976 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.933986 4783 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-server-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934008 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934018 4783 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1aa1eeb1-d389-4933-a40b-3383b28597c2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934027 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934034 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934043 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934051 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kprh4\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-kube-api-access-kprh4\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.934058 4783 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1aa1eeb1-d389-4933-a40b-3383b28597c2-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.955552 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.973407 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1aa1eeb1-d389-4933-a40b-3383b28597c2" (UID: "1aa1eeb1-d389-4933-a40b-3383b28597c2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:07 crc kubenswrapper[4783]: I0131 09:22:07.998525 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.001401 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.018728 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: E0131 09:22:08.019117 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="setup-container" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.019138 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="setup-container" Jan 31 09:22:08 crc kubenswrapper[4783]: E0131 09:22:08.019148 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="rabbitmq" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.019154 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="rabbitmq" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.019473 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" containerName="rabbitmq" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.020403 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.021932 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.022926 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.023147 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.023347 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dcjql" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.023488 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.025722 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.026285 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.032229 4783 scope.go:117] "RemoveContainer" containerID="e7e18d5ab9b16321ee2a0c8b2935712bc5ad499d23d1ae5dc9633f021df16c76" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.036554 4783 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1aa1eeb1-d389-4933-a40b-3383b28597c2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.036578 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.057925 4783 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.070519 4783 scope.go:117] "RemoveContainer" containerID="6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.122409 4783 scope.go:117] "RemoveContainer" containerID="e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141601 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4902a1ee-5b54-48bd-b8fb-8be63db315a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141647 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141692 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141722 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " 
pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141744 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141764 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141792 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4902a1ee-5b54-48bd-b8fb-8be63db315a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141823 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141852 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 
09:22:08.141880 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.141904 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d77\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-kube-api-access-g8d77\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.148083 4783 scope.go:117] "RemoveContainer" containerID="6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160" Jan 31 09:22:08 crc kubenswrapper[4783]: E0131 09:22:08.148517 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160\": container with ID starting with 6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160 not found: ID does not exist" containerID="6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.148564 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160"} err="failed to get container status \"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160\": rpc error: code = NotFound desc = could not find container \"6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160\": container with ID starting with 6d6d3cb2cb002f9fcf548645595b8de8b5379e5124a23c0a84985e52c5add160 not found: ID does not exist" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 
09:22:08.148597 4783 scope.go:117] "RemoveContainer" containerID="e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800" Jan 31 09:22:08 crc kubenswrapper[4783]: E0131 09:22:08.148926 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800\": container with ID starting with e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800 not found: ID does not exist" containerID="e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.148986 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800"} err="failed to get container status \"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800\": rpc error: code = NotFound desc = could not find container \"e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800\": container with ID starting with e2b77e2267fe204a89c641d60d263440188dd11b1536b9062ded3cf26ce92800 not found: ID does not exist" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.231986 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264659 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4902a1ee-5b54-48bd-b8fb-8be63db315a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264759 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264807 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264830 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264853 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264896 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.264941 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4902a1ee-5b54-48bd-b8fb-8be63db315a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 
09:22:08.264994 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.265058 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.265111 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.265178 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d77\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-kube-api-access-g8d77\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.267011 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.267749 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-config-data\") 
pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.268038 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.271610 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.272586 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4902a1ee-5b54-48bd-b8fb-8be63db315a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.272930 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.273909 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4902a1ee-5b54-48bd-b8fb-8be63db315a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 
09:22:08.275950 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.283697 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4902a1ee-5b54-48bd-b8fb-8be63db315a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.291094 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.293852 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d77\" (UniqueName: \"kubernetes.io/projected/4902a1ee-5b54-48bd-b8fb-8be63db315a5-kube-api-access-g8d77\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.299920 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.310704 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.313152 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.316347 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs8js" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.316511 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.316822 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.321531 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.323319 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.323511 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.324621 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.332889 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.345520 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4902a1ee-5b54-48bd-b8fb-8be63db315a5\") " pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.458212 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 
09:22:08.467976 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468025 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99ac760c-1287-4674-9133-ee9124e9fbbd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468076 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468108 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swdz4\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-kube-api-access-swdz4\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468251 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: 
I0131 09:22:08.468386 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468497 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468580 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99ac760c-1287-4674-9133-ee9124e9fbbd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468744 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468827 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.468973 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570248 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99ac760c-1287-4674-9133-ee9124e9fbbd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570412 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570482 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570582 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570675 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570749 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99ac760c-1287-4674-9133-ee9124e9fbbd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570820 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570884 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swdz4\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-kube-api-access-swdz4\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.570945 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.571010 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.571074 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.571377 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.571583 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.571874 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.572280 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 
09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.573511 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99ac760c-1287-4674-9133-ee9124e9fbbd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.573518 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.576437 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99ac760c-1287-4674-9133-ee9124e9fbbd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.576975 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.577398 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.589422 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-swdz4\" (UniqueName: \"kubernetes.io/projected/99ac760c-1287-4674-9133-ee9124e9fbbd-kube-api-access-swdz4\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.592537 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99ac760c-1287-4674-9133-ee9124e9fbbd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.598644 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"99ac760c-1287-4674-9133-ee9124e9fbbd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.644853 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.666770 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.909793 4783 generic.go:334] "Generic (PLEG): container finished" podID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerID="8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d" exitCode=0 Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.909850 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" event={"ID":"e3e01e6e-b0e5-4746-b235-85dde1fc7084","Type":"ContainerDied","Data":"8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d"} Jan 31 09:22:08 crc kubenswrapper[4783]: I0131 09:22:08.909875 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" event={"ID":"e3e01e6e-b0e5-4746-b235-85dde1fc7084","Type":"ContainerStarted","Data":"ab13e52a1b02dae3aaf1a7b7e9a87fdb5602b1e5d6b985da1238d78f68ad7f76"} Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.059841 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:22:09 crc kubenswrapper[4783]: W0131 09:22:09.066748 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4902a1ee_5b54_48bd_b8fb_8be63db315a5.slice/crio-6d85d9813d1556daf22b195f78affb3a0bbc6a92fe76a22cb5635ce8efb86b9f WatchSource:0}: Error finding container 6d85d9813d1556daf22b195f78affb3a0bbc6a92fe76a22cb5635ce8efb86b9f: Status 404 returned error can't find the container with id 6d85d9813d1556daf22b195f78affb3a0bbc6a92fe76a22cb5635ce8efb86b9f Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.175690 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.656588 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa1eeb1-d389-4933-a40b-3383b28597c2" 
path="/var/lib/kubelet/pods/1aa1eeb1-d389-4933-a40b-3383b28597c2/volumes" Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.657800 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e44f3996-11b5-4095-a1f3-e1bc24974386" path="/var/lib/kubelet/pods/e44f3996-11b5-4095-a1f3-e1bc24974386/volumes" Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.923436 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" event={"ID":"e3e01e6e-b0e5-4746-b235-85dde1fc7084","Type":"ContainerStarted","Data":"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e"} Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.923558 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.926253 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99ac760c-1287-4674-9133-ee9124e9fbbd","Type":"ContainerStarted","Data":"ccf6c55bd8f0d05db47b51f5cdb5204548cfa7d29cb18a8a7b044ffe756988c7"} Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.927472 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4902a1ee-5b54-48bd-b8fb-8be63db315a5","Type":"ContainerStarted","Data":"6d85d9813d1556daf22b195f78affb3a0bbc6a92fe76a22cb5635ce8efb86b9f"} Jan 31 09:22:09 crc kubenswrapper[4783]: I0131 09:22:09.950284 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" podStartSLOduration=2.950263784 podStartE2EDuration="2.950263784s" podCreationTimestamp="2026-01-31 09:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:09.939346313 +0000 UTC m=+1040.608029781" watchObservedRunningTime="2026-01-31 09:22:09.950263784 +0000 UTC m=+1040.618947252" 
Jan 31 09:22:10 crc kubenswrapper[4783]: I0131 09:22:10.936380 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99ac760c-1287-4674-9133-ee9124e9fbbd","Type":"ContainerStarted","Data":"1840ade74a55d8c9fdc8d48c61b0cb02b0281d6d171859c18d6a58db479a1bfe"} Jan 31 09:22:10 crc kubenswrapper[4783]: I0131 09:22:10.938479 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4902a1ee-5b54-48bd-b8fb-8be63db315a5","Type":"ContainerStarted","Data":"c1651a0f7d10ca9735d81ac0caf8ce27ce56f4b21e6d252eb4ebc42c8feefe1d"} Jan 31 09:22:17 crc kubenswrapper[4783]: I0131 09:22:17.921344 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:17 crc kubenswrapper[4783]: I0131 09:22:17.979470 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:22:17 crc kubenswrapper[4783]: I0131 09:22:17.979716 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="dnsmasq-dns" containerID="cri-o://49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc" gracePeriod=10 Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.103742 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-m8kdr"] Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.113131 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.117205 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-m8kdr"] Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188400 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188561 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-config\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188824 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188874 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188940 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.188975 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjx8m\" (UniqueName: \"kubernetes.io/projected/19000338-c242-44b7-a9e2-1a0c0c15f58b-kube-api-access-cjx8m\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.189087 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291264 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291340 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-config\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291413 4783 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291436 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291464 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291486 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjx8m\" (UniqueName: \"kubernetes.io/projected/19000338-c242-44b7-a9e2-1a0c0c15f58b-kube-api-access-cjx8m\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.291532 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.292718 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.293247 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.293948 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.294336 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.294718 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-config\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.294854 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19000338-c242-44b7-a9e2-1a0c0c15f58b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.329245 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjx8m\" (UniqueName: \"kubernetes.io/projected/19000338-c242-44b7-a9e2-1a0c0c15f58b-kube-api-access-cjx8m\") pod \"dnsmasq-dns-66fc59ccbf-m8kdr\" (UID: \"19000338-c242-44b7-a9e2-1a0c0c15f58b\") " pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.435445 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.436322 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.495239 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.495570 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zb5\" (UniqueName: \"kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.495718 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: 
I0131 09:22:18.495827 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.495984 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.496022 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0\") pod \"6c8f0127-fb81-4060-8fc5-e12eab702218\" (UID: \"6c8f0127-fb81-4060-8fc5-e12eab702218\") " Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.502762 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5" (OuterVolumeSpecName: "kube-api-access-p6zb5") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "kube-api-access-p6zb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.534020 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.535100 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.539244 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config" (OuterVolumeSpecName: "config") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.549458 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.556570 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c8f0127-fb81-4060-8fc5-e12eab702218" (UID: "6c8f0127-fb81-4060-8fc5-e12eab702218"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599027 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599066 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599078 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599089 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6zb5\" (UniqueName: \"kubernetes.io/projected/6c8f0127-fb81-4060-8fc5-e12eab702218-kube-api-access-p6zb5\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599100 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.599108 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c8f0127-fb81-4060-8fc5-e12eab702218-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:18 crc kubenswrapper[4783]: I0131 09:22:18.895353 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-m8kdr"] Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.029746 4783 generic.go:334] "Generic (PLEG): container finished" podID="6c8f0127-fb81-4060-8fc5-e12eab702218" 
containerID="49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc" exitCode=0 Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.029855 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" event={"ID":"6c8f0127-fb81-4060-8fc5-e12eab702218","Type":"ContainerDied","Data":"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc"} Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.029867 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.029904 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-c7hpn" event={"ID":"6c8f0127-fb81-4060-8fc5-e12eab702218","Type":"ContainerDied","Data":"3478164d633f631ea8d7f237c012ee310500242f92dd7fb929bba3731a7eb9c9"} Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.029930 4783 scope.go:117] "RemoveContainer" containerID="49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.031064 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" event={"ID":"19000338-c242-44b7-a9e2-1a0c0c15f58b","Type":"ContainerStarted","Data":"727f8055684b21cd28cd64cbb0a703943f46fd5679d585ad7878139be66c3948"} Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.063585 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.065153 4783 scope.go:117] "RemoveContainer" containerID="1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.074150 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-c7hpn"] Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.093754 4783 scope.go:117] 
"RemoveContainer" containerID="49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc" Jan 31 09:22:19 crc kubenswrapper[4783]: E0131 09:22:19.094131 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc\": container with ID starting with 49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc not found: ID does not exist" containerID="49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.094187 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc"} err="failed to get container status \"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc\": rpc error: code = NotFound desc = could not find container \"49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc\": container with ID starting with 49d1fd28ed1a6a6b754b44c5b1c4e34681bfe9893dcccfcdce69e5dc37d8d5bc not found: ID does not exist" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.094211 4783 scope.go:117] "RemoveContainer" containerID="1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20" Jan 31 09:22:19 crc kubenswrapper[4783]: E0131 09:22:19.094545 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20\": container with ID starting with 1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20 not found: ID does not exist" containerID="1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.094584 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20"} err="failed to get container status \"1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20\": rpc error: code = NotFound desc = could not find container \"1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20\": container with ID starting with 1871cc6dc85787b0822a5ada75ec97544e1284ccf9f9db8eb7de37ac82024e20 not found: ID does not exist" Jan 31 09:22:19 crc kubenswrapper[4783]: I0131 09:22:19.654439 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" path="/var/lib/kubelet/pods/6c8f0127-fb81-4060-8fc5-e12eab702218/volumes" Jan 31 09:22:20 crc kubenswrapper[4783]: I0131 09:22:20.043974 4783 generic.go:334] "Generic (PLEG): container finished" podID="19000338-c242-44b7-a9e2-1a0c0c15f58b" containerID="895609fb541d470738cd720df54e24994070222a848b3e47c67168af5c3ba42e" exitCode=0 Jan 31 09:22:20 crc kubenswrapper[4783]: I0131 09:22:20.044022 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" event={"ID":"19000338-c242-44b7-a9e2-1a0c0c15f58b","Type":"ContainerDied","Data":"895609fb541d470738cd720df54e24994070222a848b3e47c67168af5c3ba42e"} Jan 31 09:22:21 crc kubenswrapper[4783]: I0131 09:22:21.056043 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" event={"ID":"19000338-c242-44b7-a9e2-1a0c0c15f58b","Type":"ContainerStarted","Data":"0721240b33245a1d8cd56675ad8723a8787bf30ab05ace379cfd35f3b4229e2e"} Jan 31 09:22:21 crc kubenswrapper[4783]: I0131 09:22:21.056535 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:21 crc kubenswrapper[4783]: I0131 09:22:21.075458 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" podStartSLOduration=3.075437849 
podStartE2EDuration="3.075437849s" podCreationTimestamp="2026-01-31 09:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:21.071101144 +0000 UTC m=+1051.739784612" watchObservedRunningTime="2026-01-31 09:22:21.075437849 +0000 UTC m=+1051.744121308" Jan 31 09:22:28 crc kubenswrapper[4783]: I0131 09:22:28.437337 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66fc59ccbf-m8kdr" Jan 31 09:22:28 crc kubenswrapper[4783]: I0131 09:22:28.485186 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:28 crc kubenswrapper[4783]: I0131 09:22:28.485380 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="dnsmasq-dns" containerID="cri-o://1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e" gracePeriod=10 Jan 31 09:22:28 crc kubenswrapper[4783]: I0131 09:22:28.901885 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.005683 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.005820 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.005865 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.005923 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smblk\" (UniqueName: \"kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.005947 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.006031 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.006067 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam\") pod \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\" (UID: \"e3e01e6e-b0e5-4746-b235-85dde1fc7084\") " Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.011701 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk" (OuterVolumeSpecName: "kube-api-access-smblk") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "kube-api-access-smblk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.045280 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.046886 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.047028 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.047533 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config" (OuterVolumeSpecName: "config") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.051982 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.057183 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3e01e6e-b0e5-4746-b235-85dde1fc7084" (UID: "e3e01e6e-b0e5-4746-b235-85dde1fc7084"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110152 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110207 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110220 4783 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110233 4783 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110243 4783 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110252 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smblk\" (UniqueName: \"kubernetes.io/projected/e3e01e6e-b0e5-4746-b235-85dde1fc7084-kube-api-access-smblk\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.110262 4783 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e01e6e-b0e5-4746-b235-85dde1fc7084-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.118437 
4783 generic.go:334] "Generic (PLEG): container finished" podID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerID="1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e" exitCode=0 Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.118485 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" event={"ID":"e3e01e6e-b0e5-4746-b235-85dde1fc7084","Type":"ContainerDied","Data":"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e"} Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.118513 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" event={"ID":"e3e01e6e-b0e5-4746-b235-85dde1fc7084","Type":"ContainerDied","Data":"ab13e52a1b02dae3aaf1a7b7e9a87fdb5602b1e5d6b985da1238d78f68ad7f76"} Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.118532 4783 scope.go:117] "RemoveContainer" containerID="1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.118652 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-5pdvt" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.141597 4783 scope.go:117] "RemoveContainer" containerID="8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.146924 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.153811 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-5pdvt"] Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.176357 4783 scope.go:117] "RemoveContainer" containerID="1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e" Jan 31 09:22:29 crc kubenswrapper[4783]: E0131 09:22:29.176691 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e\": container with ID starting with 1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e not found: ID does not exist" containerID="1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.176728 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e"} err="failed to get container status \"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e\": rpc error: code = NotFound desc = could not find container \"1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e\": container with ID starting with 1dab9e120ed7b552b60dde377e6b5c637a31d3c6ec4c8d940d1c3e98d473398e not found: ID does not exist" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.176748 4783 scope.go:117] "RemoveContainer" containerID="8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d" Jan 31 
09:22:29 crc kubenswrapper[4783]: E0131 09:22:29.177004 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d\": container with ID starting with 8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d not found: ID does not exist" containerID="8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.177031 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d"} err="failed to get container status \"8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d\": rpc error: code = NotFound desc = could not find container \"8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d\": container with ID starting with 8c0ed498abc68cd5867226ba824ffb7a5c7493bd7457ab767def16f82b3f848d not found: ID does not exist" Jan 31 09:22:29 crc kubenswrapper[4783]: I0131 09:22:29.655416 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" path="/var/lib/kubelet/pods/e3e01e6e-b0e5-4746-b235-85dde1fc7084/volumes" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.616472 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29"] Jan 31 09:22:41 crc kubenswrapper[4783]: E0131 09:22:41.617851 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="init" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.617868 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="init" Jan 31 09:22:41 crc kubenswrapper[4783]: E0131 09:22:41.617895 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.617901 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: E0131 09:22:41.617916 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.617923 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: E0131 09:22:41.617969 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="init" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.617975 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="init" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.618220 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e01e6e-b0e5-4746-b235-85dde1fc7084" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.618246 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8f0127-fb81-4060-8fc5-e12eab702218" containerName="dnsmasq-dns" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.619103 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.622247 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.622247 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.625564 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.626637 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29"] Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.627334 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.637194 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.637302 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.637338 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.637369 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbmp\" (UniqueName: \"kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.739693 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.739792 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.739881 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbmp\" (UniqueName: 
\"kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.740275 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.747679 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.749088 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.749800 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.754059 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbmp\" (UniqueName: \"kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:41 crc kubenswrapper[4783]: I0131 09:22:41.937649 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:22:42 crc kubenswrapper[4783]: I0131 09:22:42.246648 4783 generic.go:334] "Generic (PLEG): container finished" podID="4902a1ee-5b54-48bd-b8fb-8be63db315a5" containerID="c1651a0f7d10ca9735d81ac0caf8ce27ce56f4b21e6d252eb4ebc42c8feefe1d" exitCode=0 Jan 31 09:22:42 crc kubenswrapper[4783]: I0131 09:22:42.246743 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4902a1ee-5b54-48bd-b8fb-8be63db315a5","Type":"ContainerDied","Data":"c1651a0f7d10ca9735d81ac0caf8ce27ce56f4b21e6d252eb4ebc42c8feefe1d"} Jan 31 09:22:42 crc kubenswrapper[4783]: I0131 09:22:42.398150 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29"] Jan 31 09:22:42 crc kubenswrapper[4783]: I0131 09:22:42.399232 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:22:43 crc kubenswrapper[4783]: I0131 09:22:43.255620 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" event={"ID":"f3679d22-7479-40f7-9c8b-2e0caa156965","Type":"ContainerStarted","Data":"4b2c48e2f8ce8b0c0b29d6bb6cda6835af270ac3a098eebabe0845daa552e2ef"} Jan 31 09:22:43 crc 
kubenswrapper[4783]: I0131 09:22:43.257083 4783 generic.go:334] "Generic (PLEG): container finished" podID="99ac760c-1287-4674-9133-ee9124e9fbbd" containerID="1840ade74a55d8c9fdc8d48c61b0cb02b0281d6d171859c18d6a58db479a1bfe" exitCode=0 Jan 31 09:22:43 crc kubenswrapper[4783]: I0131 09:22:43.257196 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99ac760c-1287-4674-9133-ee9124e9fbbd","Type":"ContainerDied","Data":"1840ade74a55d8c9fdc8d48c61b0cb02b0281d6d171859c18d6a58db479a1bfe"} Jan 31 09:22:43 crc kubenswrapper[4783]: I0131 09:22:43.259391 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4902a1ee-5b54-48bd-b8fb-8be63db315a5","Type":"ContainerStarted","Data":"7d85a288c0a677b7c864db814d9549e6cf59f29feaae558fc7bd024b3227d831"} Jan 31 09:22:43 crc kubenswrapper[4783]: I0131 09:22:43.259544 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 31 09:22:43 crc kubenswrapper[4783]: I0131 09:22:43.311656 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.311636414 podStartE2EDuration="36.311636414s" podCreationTimestamp="2026-01-31 09:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:43.305579916 +0000 UTC m=+1073.974263385" watchObservedRunningTime="2026-01-31 09:22:43.311636414 +0000 UTC m=+1073.980319882" Jan 31 09:22:44 crc kubenswrapper[4783]: I0131 09:22:44.271982 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"99ac760c-1287-4674-9133-ee9124e9fbbd","Type":"ContainerStarted","Data":"7495c6209518685f5dc35da5b4cb953b41219b31696d74e91a469b462215aeba"} Jan 31 09:22:44 crc kubenswrapper[4783]: I0131 09:22:44.272551 4783 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:44 crc kubenswrapper[4783]: I0131 09:22:44.296857 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.296841055 podStartE2EDuration="36.296841055s" podCreationTimestamp="2026-01-31 09:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:44.294532944 +0000 UTC m=+1074.963216412" watchObservedRunningTime="2026-01-31 09:22:44.296841055 +0000 UTC m=+1074.965524523" Jan 31 09:22:50 crc kubenswrapper[4783]: I0131 09:22:50.592407 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:22:51 crc kubenswrapper[4783]: I0131 09:22:51.357343 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" event={"ID":"f3679d22-7479-40f7-9c8b-2e0caa156965","Type":"ContainerStarted","Data":"9b26e20517a0e47e6fc28ed569b4797fbc03a74390adf571b6e0ad12656ec564"} Jan 31 09:22:58 crc kubenswrapper[4783]: I0131 09:22:58.649345 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 31 09:22:58 crc kubenswrapper[4783]: I0131 09:22:58.669792 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:22:58 crc kubenswrapper[4783]: I0131 09:22:58.676784 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" podStartSLOduration=9.485588686 podStartE2EDuration="17.676765632s" podCreationTimestamp="2026-01-31 09:22:41 +0000 UTC" firstStartedPulling="2026-01-31 09:22:42.398970687 +0000 UTC m=+1073.067654154" lastFinishedPulling="2026-01-31 09:22:50.590147632 +0000 UTC m=+1081.258831100" 
observedRunningTime="2026-01-31 09:22:51.378236275 +0000 UTC m=+1082.046919753" watchObservedRunningTime="2026-01-31 09:22:58.676765632 +0000 UTC m=+1089.345449100" Jan 31 09:23:02 crc kubenswrapper[4783]: I0131 09:23:02.467395 4783 generic.go:334] "Generic (PLEG): container finished" podID="f3679d22-7479-40f7-9c8b-2e0caa156965" containerID="9b26e20517a0e47e6fc28ed569b4797fbc03a74390adf571b6e0ad12656ec564" exitCode=0 Jan 31 09:23:02 crc kubenswrapper[4783]: I0131 09:23:02.467476 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" event={"ID":"f3679d22-7479-40f7-9c8b-2e0caa156965","Type":"ContainerDied","Data":"9b26e20517a0e47e6fc28ed569b4797fbc03a74390adf571b6e0ad12656ec564"} Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.820265 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.992300 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbmp\" (UniqueName: \"kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp\") pod \"f3679d22-7479-40f7-9c8b-2e0caa156965\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.992427 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam\") pod \"f3679d22-7479-40f7-9c8b-2e0caa156965\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.992638 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory\") pod 
\"f3679d22-7479-40f7-9c8b-2e0caa156965\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.992722 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle\") pod \"f3679d22-7479-40f7-9c8b-2e0caa156965\" (UID: \"f3679d22-7479-40f7-9c8b-2e0caa156965\") " Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.998963 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f3679d22-7479-40f7-9c8b-2e0caa156965" (UID: "f3679d22-7479-40f7-9c8b-2e0caa156965"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:03 crc kubenswrapper[4783]: I0131 09:23:03.999303 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp" (OuterVolumeSpecName: "kube-api-access-wbbmp") pod "f3679d22-7479-40f7-9c8b-2e0caa156965" (UID: "f3679d22-7479-40f7-9c8b-2e0caa156965"). InnerVolumeSpecName "kube-api-access-wbbmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.018397 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f3679d22-7479-40f7-9c8b-2e0caa156965" (UID: "f3679d22-7479-40f7-9c8b-2e0caa156965"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.019022 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory" (OuterVolumeSpecName: "inventory") pod "f3679d22-7479-40f7-9c8b-2e0caa156965" (UID: "f3679d22-7479-40f7-9c8b-2e0caa156965"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.095659 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.095700 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.095715 4783 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3679d22-7479-40f7-9c8b-2e0caa156965-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.095729 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbmp\" (UniqueName: \"kubernetes.io/projected/f3679d22-7479-40f7-9c8b-2e0caa156965-kube-api-access-wbbmp\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.492908 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" event={"ID":"f3679d22-7479-40f7-9c8b-2e0caa156965","Type":"ContainerDied","Data":"4b2c48e2f8ce8b0c0b29d6bb6cda6835af270ac3a098eebabe0845daa552e2ef"} Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.492963 
4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b2c48e2f8ce8b0c0b29d6bb6cda6835af270ac3a098eebabe0845daa552e2ef" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.493002 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.545782 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c"] Jan 31 09:23:04 crc kubenswrapper[4783]: E0131 09:23:04.546601 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3679d22-7479-40f7-9c8b-2e0caa156965" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.546622 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3679d22-7479-40f7-9c8b-2e0caa156965" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.546829 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3679d22-7479-40f7-9c8b-2e0caa156965" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.547559 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.550719 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.550922 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.551120 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.551376 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.570544 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c"] Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.608173 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrm48\" (UniqueName: \"kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.608348 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.608508 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.710873 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.710940 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrm48\" (UniqueName: \"kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.711083 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.716434 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.716645 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.726741 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrm48\" (UniqueName: \"kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-trw4c\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:04 crc kubenswrapper[4783]: I0131 09:23:04.860580 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:05 crc kubenswrapper[4783]: I0131 09:23:05.323830 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c"] Jan 31 09:23:05 crc kubenswrapper[4783]: I0131 09:23:05.503518 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" event={"ID":"968fb64b-ed6e-492b-a508-41dd3dd98085","Type":"ContainerStarted","Data":"2aab1f5e73574e8c3060dfffde2ea17b84ce4151192e5eebece56e6cbe4739b1"} Jan 31 09:23:06 crc kubenswrapper[4783]: I0131 09:23:06.514752 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" event={"ID":"968fb64b-ed6e-492b-a508-41dd3dd98085","Type":"ContainerStarted","Data":"8e3af6e6d2d1ef646c1fdb0ec8694044105becf8e21bc7c289fdb078fe84fd39"} Jan 31 09:23:06 crc kubenswrapper[4783]: I0131 09:23:06.541463 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" podStartSLOduration=2.050647687 podStartE2EDuration="2.541445329s" podCreationTimestamp="2026-01-31 09:23:04 +0000 UTC" firstStartedPulling="2026-01-31 09:23:05.331435991 +0000 UTC m=+1096.000119460" lastFinishedPulling="2026-01-31 09:23:05.822233634 +0000 UTC m=+1096.490917102" observedRunningTime="2026-01-31 09:23:06.533501944 +0000 UTC m=+1097.202185412" watchObservedRunningTime="2026-01-31 09:23:06.541445329 +0000 UTC m=+1097.210128797" Jan 31 09:23:08 crc kubenswrapper[4783]: I0131 09:23:08.537649 4783 generic.go:334] "Generic (PLEG): container finished" podID="968fb64b-ed6e-492b-a508-41dd3dd98085" containerID="8e3af6e6d2d1ef646c1fdb0ec8694044105becf8e21bc7c289fdb078fe84fd39" exitCode=0 Jan 31 09:23:08 crc kubenswrapper[4783]: I0131 09:23:08.537739 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" event={"ID":"968fb64b-ed6e-492b-a508-41dd3dd98085","Type":"ContainerDied","Data":"8e3af6e6d2d1ef646c1fdb0ec8694044105becf8e21bc7c289fdb078fe84fd39"} Jan 31 09:23:09 crc kubenswrapper[4783]: I0131 09:23:09.903439 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.030106 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory\") pod \"968fb64b-ed6e-492b-a508-41dd3dd98085\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.030213 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam\") pod \"968fb64b-ed6e-492b-a508-41dd3dd98085\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.030251 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrm48\" (UniqueName: \"kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48\") pod \"968fb64b-ed6e-492b-a508-41dd3dd98085\" (UID: \"968fb64b-ed6e-492b-a508-41dd3dd98085\") " Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.035830 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48" (OuterVolumeSpecName: "kube-api-access-vrm48") pod "968fb64b-ed6e-492b-a508-41dd3dd98085" (UID: "968fb64b-ed6e-492b-a508-41dd3dd98085"). InnerVolumeSpecName "kube-api-access-vrm48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.056550 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory" (OuterVolumeSpecName: "inventory") pod "968fb64b-ed6e-492b-a508-41dd3dd98085" (UID: "968fb64b-ed6e-492b-a508-41dd3dd98085"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.059189 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "968fb64b-ed6e-492b-a508-41dd3dd98085" (UID: "968fb64b-ed6e-492b-a508-41dd3dd98085"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.134073 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.134107 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/968fb64b-ed6e-492b-a508-41dd3dd98085-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.134119 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrm48\" (UniqueName: \"kubernetes.io/projected/968fb64b-ed6e-492b-a508-41dd3dd98085-kube-api-access-vrm48\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.557187 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" 
event={"ID":"968fb64b-ed6e-492b-a508-41dd3dd98085","Type":"ContainerDied","Data":"2aab1f5e73574e8c3060dfffde2ea17b84ce4151192e5eebece56e6cbe4739b1"} Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.557499 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aab1f5e73574e8c3060dfffde2ea17b84ce4151192e5eebece56e6cbe4739b1" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.557239 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-trw4c" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.634052 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj"] Jan 31 09:23:10 crc kubenswrapper[4783]: E0131 09:23:10.634518 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968fb64b-ed6e-492b-a508-41dd3dd98085" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.634541 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="968fb64b-ed6e-492b-a508-41dd3dd98085" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.634750 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="968fb64b-ed6e-492b-a508-41dd3dd98085" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.635393 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.637944 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.638044 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.638142 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.638647 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.642892 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.642955 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.642986 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.643028 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmkd\" (UniqueName: \"kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.648939 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj"] Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.745725 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.745852 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.745888 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.745938 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmkd\" (UniqueName: \"kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.752566 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.752623 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.752569 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.760990 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmkd\" (UniqueName: \"kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:10 crc kubenswrapper[4783]: I0131 09:23:10.949254 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:23:11 crc kubenswrapper[4783]: I0131 09:23:11.438305 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj"] Jan 31 09:23:11 crc kubenswrapper[4783]: I0131 09:23:11.568281 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" event={"ID":"b55d8f8e-46cb-4119-a32b-723b06e29764","Type":"ContainerStarted","Data":"29287fe0a8191920623759fac1f7712620528e46d9ff02effa5e68a1a6d89605"} Jan 31 09:23:12 crc kubenswrapper[4783]: I0131 09:23:12.577466 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" event={"ID":"b55d8f8e-46cb-4119-a32b-723b06e29764","Type":"ContainerStarted","Data":"fb0e54a90bf740989460a12c23d8b1dd965937ddc1e1c2071ba08b595999d008"} Jan 31 09:23:12 crc kubenswrapper[4783]: I0131 09:23:12.600787 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" podStartSLOduration=2.099426673 podStartE2EDuration="2.600762193s" podCreationTimestamp="2026-01-31 09:23:10 +0000 UTC" firstStartedPulling="2026-01-31 09:23:11.427417908 +0000 UTC m=+1102.096101376" 
lastFinishedPulling="2026-01-31 09:23:11.928753427 +0000 UTC m=+1102.597436896" observedRunningTime="2026-01-31 09:23:12.590815512 +0000 UTC m=+1103.259498980" watchObservedRunningTime="2026-01-31 09:23:12.600762193 +0000 UTC m=+1103.269445662" Jan 31 09:23:17 crc kubenswrapper[4783]: I0131 09:23:17.756077 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:23:17 crc kubenswrapper[4783]: I0131 09:23:17.756597 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:23:47 crc kubenswrapper[4783]: I0131 09:23:47.756877 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:23:47 crc kubenswrapper[4783]: I0131 09:23:47.757449 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:23:59 crc kubenswrapper[4783]: I0131 09:23:59.121991 4783 scope.go:117] "RemoveContainer" containerID="9c8955ce5a44a6f8dab9197031364dc10ab88c7ade6d4c9985fb2908b0375e1c" Jan 31 09:24:17 crc kubenswrapper[4783]: I0131 09:24:17.756901 4783 
patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:24:17 crc kubenswrapper[4783]: I0131 09:24:17.757474 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:24:17 crc kubenswrapper[4783]: I0131 09:24:17.757511 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:24:17 crc kubenswrapper[4783]: I0131 09:24:17.758330 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:24:17 crc kubenswrapper[4783]: I0131 09:24:17.758379 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d" gracePeriod=600 Jan 31 09:24:18 crc kubenswrapper[4783]: I0131 09:24:18.139682 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d" exitCode=0 Jan 31 09:24:18 crc 
kubenswrapper[4783]: I0131 09:24:18.139755 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d"} Jan 31 09:24:18 crc kubenswrapper[4783]: I0131 09:24:18.139914 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca"} Jan 31 09:24:18 crc kubenswrapper[4783]: I0131 09:24:18.139940 4783 scope.go:117] "RemoveContainer" containerID="a28aa009c9b1798b35b666e609764f43f71694d4a62c6d2fec1ffdd0fb94bbed" Jan 31 09:24:59 crc kubenswrapper[4783]: I0131 09:24:59.180769 4783 scope.go:117] "RemoveContainer" containerID="b29a5cc127c8a868aa3a95c1ff709c7d0c5b580c5d71197fdd3e747de1a870c8" Jan 31 09:24:59 crc kubenswrapper[4783]: I0131 09:24:59.233679 4783 scope.go:117] "RemoveContainer" containerID="d9a7dd089c49c65762431a2e230d81aefcdb23477c5c4143dc88da366e13df57" Jan 31 09:26:07 crc kubenswrapper[4783]: I0131 09:26:07.034405 4783 generic.go:334] "Generic (PLEG): container finished" podID="b55d8f8e-46cb-4119-a32b-723b06e29764" containerID="fb0e54a90bf740989460a12c23d8b1dd965937ddc1e1c2071ba08b595999d008" exitCode=0 Jan 31 09:26:07 crc kubenswrapper[4783]: I0131 09:26:07.034476 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" event={"ID":"b55d8f8e-46cb-4119-a32b-723b06e29764","Type":"ContainerDied","Data":"fb0e54a90bf740989460a12c23d8b1dd965937ddc1e1c2071ba08b595999d008"} Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.360663 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.529952 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory\") pod \"b55d8f8e-46cb-4119-a32b-723b06e29764\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.530349 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mmkd\" (UniqueName: \"kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd\") pod \"b55d8f8e-46cb-4119-a32b-723b06e29764\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.530396 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam\") pod \"b55d8f8e-46cb-4119-a32b-723b06e29764\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.530513 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle\") pod \"b55d8f8e-46cb-4119-a32b-723b06e29764\" (UID: \"b55d8f8e-46cb-4119-a32b-723b06e29764\") " Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.536881 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b55d8f8e-46cb-4119-a32b-723b06e29764" (UID: "b55d8f8e-46cb-4119-a32b-723b06e29764"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.536959 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd" (OuterVolumeSpecName: "kube-api-access-6mmkd") pod "b55d8f8e-46cb-4119-a32b-723b06e29764" (UID: "b55d8f8e-46cb-4119-a32b-723b06e29764"). InnerVolumeSpecName "kube-api-access-6mmkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.554914 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b55d8f8e-46cb-4119-a32b-723b06e29764" (UID: "b55d8f8e-46cb-4119-a32b-723b06e29764"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.557319 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory" (OuterVolumeSpecName: "inventory") pod "b55d8f8e-46cb-4119-a32b-723b06e29764" (UID: "b55d8f8e-46cb-4119-a32b-723b06e29764"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.632268 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.632299 4783 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.632312 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b55d8f8e-46cb-4119-a32b-723b06e29764-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:08 crc kubenswrapper[4783]: I0131 09:26:08.632345 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mmkd\" (UniqueName: \"kubernetes.io/projected/b55d8f8e-46cb-4119-a32b-723b06e29764-kube-api-access-6mmkd\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.054354 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" event={"ID":"b55d8f8e-46cb-4119-a32b-723b06e29764","Type":"ContainerDied","Data":"29287fe0a8191920623759fac1f7712620528e46d9ff02effa5e68a1a6d89605"} Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.054737 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29287fe0a8191920623759fac1f7712620528e46d9ff02effa5e68a1a6d89605" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.054397 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.122465 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc"] Jan 31 09:26:09 crc kubenswrapper[4783]: E0131 09:26:09.123179 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d8f8e-46cb-4119-a32b-723b06e29764" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.123317 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d8f8e-46cb-4119-a32b-723b06e29764" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.123649 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55d8f8e-46cb-4119-a32b-723b06e29764" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.124487 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.125872 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.125926 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.126511 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.126573 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.134882 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc"] Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.147030 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.147246 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 
09:26:09.147408 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjk2\" (UniqueName: \"kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.248623 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.248703 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.248758 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjk2\" (UniqueName: \"kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.253063 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.253574 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.264152 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjk2\" (UniqueName: \"kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.439478 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:26:09 crc kubenswrapper[4783]: I0131 09:26:09.891262 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc"] Jan 31 09:26:10 crc kubenswrapper[4783]: I0131 09:26:10.063403 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" event={"ID":"f6315c3c-0101-4935-b081-37414dd7e27e","Type":"ContainerStarted","Data":"a7645184a20b5f72d68befaf2005337db877fc857da3b4e810128064aee306ec"} Jan 31 09:26:11 crc kubenswrapper[4783]: I0131 09:26:11.072600 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" event={"ID":"f6315c3c-0101-4935-b081-37414dd7e27e","Type":"ContainerStarted","Data":"e5e8e9732960938e09bfd38fe8996b5e79beeae3d9e601b25aa664f21379cb35"} Jan 31 09:26:11 crc kubenswrapper[4783]: I0131 09:26:11.086931 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" podStartSLOduration=1.536458948 podStartE2EDuration="2.086915711s" podCreationTimestamp="2026-01-31 09:26:09 +0000 UTC" firstStartedPulling="2026-01-31 09:26:09.89451856 +0000 UTC m=+1280.563202029" lastFinishedPulling="2026-01-31 09:26:10.444975324 +0000 UTC m=+1281.113658792" observedRunningTime="2026-01-31 09:26:11.084441625 +0000 UTC m=+1281.753125094" watchObservedRunningTime="2026-01-31 09:26:11.086915711 +0000 UTC m=+1281.755599179" Jan 31 09:26:47 crc kubenswrapper[4783]: I0131 09:26:47.756665 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:26:47 
crc kubenswrapper[4783]: I0131 09:26:47.757146 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:27:17 crc kubenswrapper[4783]: I0131 09:27:17.757289 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:27:17 crc kubenswrapper[4783]: I0131 09:27:17.757907 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:27:47 crc kubenswrapper[4783]: I0131 09:27:47.756747 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:27:47 crc kubenswrapper[4783]: I0131 09:27:47.757411 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:27:47 crc kubenswrapper[4783]: I0131 09:27:47.757456 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:27:47 crc kubenswrapper[4783]: I0131 09:27:47.758288 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:27:47 crc kubenswrapper[4783]: I0131 09:27:47.758353 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca" gracePeriod=600 Jan 31 09:27:48 crc kubenswrapper[4783]: I0131 09:27:48.858460 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca" exitCode=0 Jan 31 09:27:48 crc kubenswrapper[4783]: I0131 09:27:48.858534 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca"} Jan 31 09:27:48 crc kubenswrapper[4783]: I0131 09:27:48.858945 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b"} Jan 31 09:27:48 crc kubenswrapper[4783]: I0131 09:27:48.858971 4783 scope.go:117] "RemoveContainer" 
containerID="59c827a13c686c020544e19ef18874c9811559e467147dcdea6ae441d681ed0d" Jan 31 09:27:50 crc kubenswrapper[4783]: I0131 09:27:50.037421 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2ed3-account-create-update-xmcll"] Jan 31 09:27:50 crc kubenswrapper[4783]: I0131 09:27:50.044459 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jf4sp"] Jan 31 09:27:50 crc kubenswrapper[4783]: I0131 09:27:50.050725 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2ed3-account-create-update-xmcll"] Jan 31 09:27:50 crc kubenswrapper[4783]: I0131 09:27:50.056108 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jf4sp"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.031100 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-czqdq"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.040909 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b696-account-create-update-4xlwd"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.047932 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-18c3-account-create-update-bb8tv"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.057157 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9cg97"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.064469 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9cg97"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.070264 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b696-account-create-update-4xlwd"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.085275 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-18c3-account-create-update-bb8tv"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 
09:27:51.100205 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-czqdq"] Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.658662 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046f6bf6-2c59-43f9-8964-5949209241b5" path="/var/lib/kubelet/pods/046f6bf6-2c59-43f9-8964-5949209241b5/volumes" Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.659941 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07cd4eec-ab95-4246-ab07-30bd4b8d6b9e" path="/var/lib/kubelet/pods/07cd4eec-ab95-4246-ab07-30bd4b8d6b9e/volumes" Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.660644 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f714661-d55e-4c8f-b2c2-8420206b1a72" path="/var/lib/kubelet/pods/3f714661-d55e-4c8f-b2c2-8420206b1a72/volumes" Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.661330 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdad584-8c0f-433e-b36d-1a8584cecc18" path="/var/lib/kubelet/pods/6bdad584-8c0f-433e-b36d-1a8584cecc18/volumes" Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.662984 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5fc027-5dcc-47d7-972a-ddf14c314725" path="/var/lib/kubelet/pods/ec5fc027-5dcc-47d7-972a-ddf14c314725/volumes" Jan 31 09:27:51 crc kubenswrapper[4783]: I0131 09:27:51.663636 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd3660ec-5394-45d9-ae35-5c23e4749178" path="/var/lib/kubelet/pods/fd3660ec-5394-45d9-ae35-5c23e4749178/volumes" Jan 31 09:27:56 crc kubenswrapper[4783]: I0131 09:27:56.930557 4783 generic.go:334] "Generic (PLEG): container finished" podID="f6315c3c-0101-4935-b081-37414dd7e27e" containerID="e5e8e9732960938e09bfd38fe8996b5e79beeae3d9e601b25aa664f21379cb35" exitCode=0 Jan 31 09:27:56 crc kubenswrapper[4783]: I0131 09:27:56.930674 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" event={"ID":"f6315c3c-0101-4935-b081-37414dd7e27e","Type":"ContainerDied","Data":"e5e8e9732960938e09bfd38fe8996b5e79beeae3d9e601b25aa664f21379cb35"} Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.028610 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x7h7f"] Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.035141 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x7h7f"] Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.265471 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.395192 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory\") pod \"f6315c3c-0101-4935-b081-37414dd7e27e\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.395314 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam\") pod \"f6315c3c-0101-4935-b081-37414dd7e27e\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.395339 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pjk2\" (UniqueName: \"kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2\") pod \"f6315c3c-0101-4935-b081-37414dd7e27e\" (UID: \"f6315c3c-0101-4935-b081-37414dd7e27e\") " Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.400578 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2" (OuterVolumeSpecName: "kube-api-access-5pjk2") pod "f6315c3c-0101-4935-b081-37414dd7e27e" (UID: "f6315c3c-0101-4935-b081-37414dd7e27e"). InnerVolumeSpecName "kube-api-access-5pjk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.418747 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f6315c3c-0101-4935-b081-37414dd7e27e" (UID: "f6315c3c-0101-4935-b081-37414dd7e27e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.419103 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory" (OuterVolumeSpecName: "inventory") pod "f6315c3c-0101-4935-b081-37414dd7e27e" (UID: "f6315c3c-0101-4935-b081-37414dd7e27e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.498287 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.498341 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6315c3c-0101-4935-b081-37414dd7e27e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.498357 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pjk2\" (UniqueName: \"kubernetes.io/projected/f6315c3c-0101-4935-b081-37414dd7e27e-kube-api-access-5pjk2\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.947455 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" event={"ID":"f6315c3c-0101-4935-b081-37414dd7e27e","Type":"ContainerDied","Data":"a7645184a20b5f72d68befaf2005337db877fc857da3b4e810128064aee306ec"} Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.947502 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc" Jan 31 09:27:58 crc kubenswrapper[4783]: I0131 09:27:58.947515 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7645184a20b5f72d68befaf2005337db877fc857da3b4e810128064aee306ec" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.011103 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj"] Jan 31 09:27:59 crc kubenswrapper[4783]: E0131 09:27:59.011454 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6315c3c-0101-4935-b081-37414dd7e27e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.011473 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6315c3c-0101-4935-b081-37414dd7e27e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.011654 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6315c3c-0101-4935-b081-37414dd7e27e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.012191 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.015110 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.015183 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.015503 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.017085 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.025382 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj"] Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.111112 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75m6c\" (UniqueName: \"kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.111264 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: 
I0131 09:27:59.111297 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.212546 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75m6c\" (UniqueName: \"kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.212859 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.212884 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.217501 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.217853 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.225134 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75m6c\" (UniqueName: \"kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.326003 4783 scope.go:117] "RemoveContainer" containerID="5ba8e19f22990ec5e7786723159e19551167ec369babf57c45970338eda81d3e" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.327486 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.367344 4783 scope.go:117] "RemoveContainer" containerID="4113d6f5019e94f91242acbd37473b5b69b8d49e9cf65b4cec6f0a499f7266ae" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.434813 4783 scope.go:117] "RemoveContainer" containerID="0a930ede93072e683441e4fb215175a2a0361872579f82d076792fa9c90a3bcb" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.472658 4783 scope.go:117] "RemoveContainer" containerID="0c9a8cb448fa4629e4e00d219bf81c36a6956474c9eea01dc7f29310e8cda6fd" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.490735 4783 scope.go:117] "RemoveContainer" containerID="3a68dc5e2c08d6cd779d6d73430d6226a5f55ea0545148e20ccb80c4b4020d6a" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.510645 4783 scope.go:117] "RemoveContainer" containerID="ba2e5aef78807d8ab62365575c0206ab61f11d3c8bb4987394232545fa2e7001" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.660058 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c6c47dd-3744-4c2e-84cd-8b6278d16ee3" path="/var/lib/kubelet/pods/7c6c47dd-3744-4c2e-84cd-8b6278d16ee3/volumes" Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.792105 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj"] Jan 31 09:27:59 crc kubenswrapper[4783]: W0131 09:27:59.792592 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c333cb5_3633_4cfe_825d_abc93c751acd.slice/crio-7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5 WatchSource:0}: Error finding container 7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5: Status 404 returned error can't find the container with id 7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5 Jan 31 09:27:59 
crc kubenswrapper[4783]: I0131 09:27:59.795390 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:27:59 crc kubenswrapper[4783]: I0131 09:27:59.957049 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" event={"ID":"4c333cb5-3633-4cfe-825d-abc93c751acd","Type":"ContainerStarted","Data":"7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5"} Jan 31 09:28:00 crc kubenswrapper[4783]: I0131 09:28:00.970199 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" event={"ID":"4c333cb5-3633-4cfe-825d-abc93c751acd","Type":"ContainerStarted","Data":"49e03c2d905af996ae822df4d5eca25041bb7df78730f270c476946cf55144dd"} Jan 31 09:28:00 crc kubenswrapper[4783]: I0131 09:28:00.990009 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" podStartSLOduration=2.50675455 podStartE2EDuration="2.989994245s" podCreationTimestamp="2026-01-31 09:27:58 +0000 UTC" firstStartedPulling="2026-01-31 09:27:59.795150829 +0000 UTC m=+1390.463834297" lastFinishedPulling="2026-01-31 09:28:00.278390524 +0000 UTC m=+1390.947073992" observedRunningTime="2026-01-31 09:28:00.984819186 +0000 UTC m=+1391.653502655" watchObservedRunningTime="2026-01-31 09:28:00.989994245 +0000 UTC m=+1391.658677713" Jan 31 09:28:11 crc kubenswrapper[4783]: I0131 09:28:11.036579 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-npz7m"] Jan 31 09:28:11 crc kubenswrapper[4783]: I0131 09:28:11.044482 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-npz7m"] Jan 31 09:28:11 crc kubenswrapper[4783]: I0131 09:28:11.656822 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f2608f-7802-42b7-bc8a-2b1dbf829514" 
path="/var/lib/kubelet/pods/64f2608f-7802-42b7-bc8a-2b1dbf829514/volumes" Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.026507 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7909-account-create-update-5jfvb"] Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.035805 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-22mw9"] Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.047154 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7909-account-create-update-5jfvb"] Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.053677 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-22mw9"] Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.654762 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac471b1-537d-4498-90ea-0c8ccb699ae8" path="/var/lib/kubelet/pods/0ac471b1-537d-4498-90ea-0c8ccb699ae8/volumes" Jan 31 09:28:27 crc kubenswrapper[4783]: I0131 09:28:27.655510 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37863ae7-16e3-4030-9d28-9f9e312e941a" path="/var/lib/kubelet/pods/37863ae7-16e3-4030-9d28-9f9e312e941a/volumes" Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.032709 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xkw48"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.040867 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-496c-account-create-update-6jkmm"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.047867 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h5m5k"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.055350 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0fd3-account-create-update-9x4ll"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.061220 4783 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xkw48"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.066630 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-496c-account-create-update-6jkmm"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.071808 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0fd3-account-create-update-9x4ll"] Jan 31 09:28:30 crc kubenswrapper[4783]: I0131 09:28:30.076755 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h5m5k"] Jan 31 09:28:31 crc kubenswrapper[4783]: I0131 09:28:31.655474 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12869632-705f-4514-b2b1-d2eb29bc986b" path="/var/lib/kubelet/pods/12869632-705f-4514-b2b1-d2eb29bc986b/volumes" Jan 31 09:28:31 crc kubenswrapper[4783]: I0131 09:28:31.656525 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181145e5-44a7-448b-b5a3-f6f03f325c01" path="/var/lib/kubelet/pods/181145e5-44a7-448b-b5a3-f6f03f325c01/volumes" Jan 31 09:28:31 crc kubenswrapper[4783]: I0131 09:28:31.657022 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de0e2db-9b04-476b-a09f-7cf7415ba0e7" path="/var/lib/kubelet/pods/2de0e2db-9b04-476b-a09f-7cf7415ba0e7/volumes" Jan 31 09:28:31 crc kubenswrapper[4783]: I0131 09:28:31.657574 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6502457-d446-4d87-ab1b-aae6fd53f95a" path="/var/lib/kubelet/pods/a6502457-d446-4d87-ab1b-aae6fd53f95a/volumes" Jan 31 09:28:33 crc kubenswrapper[4783]: I0131 09:28:33.027673 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p9k2r"] Jan 31 09:28:33 crc kubenswrapper[4783]: I0131 09:28:33.033068 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p9k2r"] Jan 31 09:28:33 crc kubenswrapper[4783]: I0131 09:28:33.656797 4783 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4d851f24-0013-417f-9734-d28d22b27057" path="/var/lib/kubelet/pods/4d851f24-0013-417f-9734-d28d22b27057/volumes" Jan 31 09:28:50 crc kubenswrapper[4783]: I0131 09:28:50.024564 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7mds6"] Jan 31 09:28:50 crc kubenswrapper[4783]: I0131 09:28:50.030445 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7mds6"] Jan 31 09:28:51 crc kubenswrapper[4783]: I0131 09:28:51.654227 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9" path="/var/lib/kubelet/pods/ca7e6f7a-4b59-42fd-9ef2-4f761e2d0af9/volumes" Jan 31 09:28:52 crc kubenswrapper[4783]: I0131 09:28:52.423675 4783 generic.go:334] "Generic (PLEG): container finished" podID="4c333cb5-3633-4cfe-825d-abc93c751acd" containerID="49e03c2d905af996ae822df4d5eca25041bb7df78730f270c476946cf55144dd" exitCode=0 Jan 31 09:28:52 crc kubenswrapper[4783]: I0131 09:28:52.423738 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" event={"ID":"4c333cb5-3633-4cfe-825d-abc93c751acd","Type":"ContainerDied","Data":"49e03c2d905af996ae822df4d5eca25041bb7df78730f270c476946cf55144dd"} Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.748431 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.919509 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam\") pod \"4c333cb5-3633-4cfe-825d-abc93c751acd\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.919831 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory\") pod \"4c333cb5-3633-4cfe-825d-abc93c751acd\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.919865 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75m6c\" (UniqueName: \"kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c\") pod \"4c333cb5-3633-4cfe-825d-abc93c751acd\" (UID: \"4c333cb5-3633-4cfe-825d-abc93c751acd\") " Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.925518 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c" (OuterVolumeSpecName: "kube-api-access-75m6c") pod "4c333cb5-3633-4cfe-825d-abc93c751acd" (UID: "4c333cb5-3633-4cfe-825d-abc93c751acd"). InnerVolumeSpecName "kube-api-access-75m6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.943529 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory" (OuterVolumeSpecName: "inventory") pod "4c333cb5-3633-4cfe-825d-abc93c751acd" (UID: "4c333cb5-3633-4cfe-825d-abc93c751acd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:28:53 crc kubenswrapper[4783]: I0131 09:28:53.943855 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4c333cb5-3633-4cfe-825d-abc93c751acd" (UID: "4c333cb5-3633-4cfe-825d-abc93c751acd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.022437 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.022471 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c333cb5-3633-4cfe-825d-abc93c751acd-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.022482 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75m6c\" (UniqueName: \"kubernetes.io/projected/4c333cb5-3633-4cfe-825d-abc93c751acd-kube-api-access-75m6c\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.441006 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" event={"ID":"4c333cb5-3633-4cfe-825d-abc93c751acd","Type":"ContainerDied","Data":"7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5"} Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.441045 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c3c5bfeaac3e68da626c9c2405cb37a92b0fff4f3977b23f8ef79c707ed45c5" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 
09:28:54.441066 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.508254 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd"] Jan 31 09:28:54 crc kubenswrapper[4783]: E0131 09:28:54.508746 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c333cb5-3633-4cfe-825d-abc93c751acd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.508770 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c333cb5-3633-4cfe-825d-abc93c751acd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.508992 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c333cb5-3633-4cfe-825d-abc93c751acd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.509681 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.511216 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.511350 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.514667 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.514819 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.516665 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd"] Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.532423 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpmzs\" (UniqueName: \"kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.532465 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 
09:28:54.532500 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.633954 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpmzs\" (UniqueName: \"kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.634036 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.634064 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.637970 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.639096 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.648333 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpmzs\" (UniqueName: \"kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-97pwd\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:54 crc kubenswrapper[4783]: I0131 09:28:54.824080 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:28:55 crc kubenswrapper[4783]: I0131 09:28:55.262976 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd"] Jan 31 09:28:55 crc kubenswrapper[4783]: I0131 09:28:55.449274 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" event={"ID":"1f0db336-229b-4d05-b3e9-aaa8b26b08c4","Type":"ContainerStarted","Data":"e51288ff6041905d9c535c238ffaedc588d7f52822354946aef624ac187ea37c"} Jan 31 09:28:56 crc kubenswrapper[4783]: I0131 09:28:56.458983 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" event={"ID":"1f0db336-229b-4d05-b3e9-aaa8b26b08c4","Type":"ContainerStarted","Data":"27913826597d4144776a57b7300fb1c82c0aeb09f25f406c1f5e172d245fa1d6"} Jan 31 09:28:56 crc kubenswrapper[4783]: I0131 09:28:56.477316 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" podStartSLOduration=1.9924444430000001 podStartE2EDuration="2.47730268s" podCreationTimestamp="2026-01-31 09:28:54 +0000 UTC" firstStartedPulling="2026-01-31 09:28:55.266787583 +0000 UTC m=+1445.935471051" lastFinishedPulling="2026-01-31 09:28:55.75164582 +0000 UTC m=+1446.420329288" observedRunningTime="2026-01-31 09:28:56.470562109 +0000 UTC m=+1447.139245578" watchObservedRunningTime="2026-01-31 09:28:56.47730268 +0000 UTC m=+1447.145986148" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.486603 4783 generic.go:334] "Generic (PLEG): container finished" podID="1f0db336-229b-4d05-b3e9-aaa8b26b08c4" containerID="27913826597d4144776a57b7300fb1c82c0aeb09f25f406c1f5e172d245fa1d6" exitCode=0 Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.486708 4783 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" event={"ID":"1f0db336-229b-4d05-b3e9-aaa8b26b08c4","Type":"ContainerDied","Data":"27913826597d4144776a57b7300fb1c82c0aeb09f25f406c1f5e172d245fa1d6"} Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.618665 4783 scope.go:117] "RemoveContainer" containerID="f51de41216bf7f21fcebd2b853cabe4c72d511b06aa125c331efed24a22160d9" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.653559 4783 scope.go:117] "RemoveContainer" containerID="8cddf7ac4a81c13114c84391a5017d5aca00eecc478930b10b608a8c2b1f6ddc" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.682688 4783 scope.go:117] "RemoveContainer" containerID="a17c1c50244939cae88c79b5c8a17429354156e7d7fa02fd3c273caaa04394ca" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.733938 4783 scope.go:117] "RemoveContainer" containerID="4c3ad5d4efa5f8761849e7c13dd01b28de263966ea54c345a85746b51cfe6323" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.776213 4783 scope.go:117] "RemoveContainer" containerID="beb762ada300dca21fab34d7f101d847437b4613ce8f4d1435b4bb65234feb2c" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.794869 4783 scope.go:117] "RemoveContainer" containerID="150e5e0164b552e8488ac8a43500e605706978ef8990e67ecf3ee6a5e5f27f0b" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.848639 4783 scope.go:117] "RemoveContainer" containerID="34d32922b8a575ca56d1ec1e4a5c7b47e34f1c8ae94ab4f026da18bb69ef8b30" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.867658 4783 scope.go:117] "RemoveContainer" containerID="3e392250f5a57eeb4ae09672f984aacd6138ff0541a50f2720f57ebdf40b49d2" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.897807 4783 scope.go:117] "RemoveContainer" containerID="76dd931b2b6a05e8c9b2f971eb4750f62ab7a839ea67268ca8fbf2909701b98c" Jan 31 09:28:59 crc kubenswrapper[4783]: I0131 09:28:59.915225 4783 scope.go:117] "RemoveContainer" 
containerID="58612a40d7d85fca3d09f85cde23b1b012a2b2dce1840cc535b9f3510b7ebad5" Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.794483 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.950228 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam\") pod \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.950306 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpmzs\" (UniqueName: \"kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs\") pod \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.950355 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory\") pod \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\" (UID: \"1f0db336-229b-4d05-b3e9-aaa8b26b08c4\") " Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.956401 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs" (OuterVolumeSpecName: "kube-api-access-gpmzs") pod "1f0db336-229b-4d05-b3e9-aaa8b26b08c4" (UID: "1f0db336-229b-4d05-b3e9-aaa8b26b08c4"). InnerVolumeSpecName "kube-api-access-gpmzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.975678 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f0db336-229b-4d05-b3e9-aaa8b26b08c4" (UID: "1f0db336-229b-4d05-b3e9-aaa8b26b08c4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:00 crc kubenswrapper[4783]: I0131 09:29:00.975961 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory" (OuterVolumeSpecName: "inventory") pod "1f0db336-229b-4d05-b3e9-aaa8b26b08c4" (UID: "1f0db336-229b-4d05-b3e9-aaa8b26b08c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.053704 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.053738 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpmzs\" (UniqueName: \"kubernetes.io/projected/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-kube-api-access-gpmzs\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.053749 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f0db336-229b-4d05-b3e9-aaa8b26b08c4-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.505585 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" 
event={"ID":"1f0db336-229b-4d05-b3e9-aaa8b26b08c4","Type":"ContainerDied","Data":"e51288ff6041905d9c535c238ffaedc588d7f52822354946aef624ac187ea37c"} Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.505624 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51288ff6041905d9c535c238ffaedc588d7f52822354946aef624ac187ea37c" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.505625 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-97pwd" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.560754 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls"] Jan 31 09:29:01 crc kubenswrapper[4783]: E0131 09:29:01.561285 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0db336-229b-4d05-b3e9-aaa8b26b08c4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.561361 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0db336-229b-4d05-b3e9-aaa8b26b08c4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.561585 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0db336-229b-4d05-b3e9-aaa8b26b08c4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.562156 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.564276 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.565929 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.567273 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.567514 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.570105 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls"] Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.664576 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5xm\" (UniqueName: \"kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.664964 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.665140 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.767332 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.767407 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.767466 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5xm\" (UniqueName: \"kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.773222 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.773555 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.783373 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5xm\" (UniqueName: \"kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c9hls\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:01 crc kubenswrapper[4783]: I0131 09:29:01.873817 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:02 crc kubenswrapper[4783]: I0131 09:29:02.334518 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls"] Jan 31 09:29:02 crc kubenswrapper[4783]: I0131 09:29:02.514498 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" event={"ID":"77285e40-3f9b-491d-a911-0f7b5c8058fb","Type":"ContainerStarted","Data":"3b071a528d22caa256cb537b757b2d893891d7b2a82d63097c1101bfa5c7cc79"} Jan 31 09:29:03 crc kubenswrapper[4783]: I0131 09:29:03.527078 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" event={"ID":"77285e40-3f9b-491d-a911-0f7b5c8058fb","Type":"ContainerStarted","Data":"13fb7076072b01489a52415450edac0ac52ef2f46aa8cb2e4c2248e4a5ff2fe9"} Jan 31 09:29:03 crc kubenswrapper[4783]: I0131 09:29:03.543862 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" podStartSLOduration=2.0667535519999998 podStartE2EDuration="2.543841487s" podCreationTimestamp="2026-01-31 09:29:01 +0000 UTC" firstStartedPulling="2026-01-31 09:29:02.340358411 +0000 UTC m=+1453.009041878" lastFinishedPulling="2026-01-31 09:29:02.817446345 +0000 UTC m=+1453.486129813" observedRunningTime="2026-01-31 09:29:03.543716812 +0000 UTC m=+1454.212400280" watchObservedRunningTime="2026-01-31 09:29:03.543841487 +0000 UTC m=+1454.212524955" Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.037012 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gksrm"] Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.044424 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-c5vpg"] Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.054045 
4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gksrm"] Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.060239 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-c5vpg"] Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.657210 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3abc10-848a-448b-a2a2-df825e99a23f" path="/var/lib/kubelet/pods/be3abc10-848a-448b-a2a2-df825e99a23f/volumes" Jan 31 09:29:07 crc kubenswrapper[4783]: I0131 09:29:07.658726 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1106ff1-8d0e-4e9f-ba62-d6a443279e0e" path="/var/lib/kubelet/pods/e1106ff1-8d0e-4e9f-ba62-d6a443279e0e/volumes" Jan 31 09:29:16 crc kubenswrapper[4783]: I0131 09:29:16.029850 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k2pw7"] Jan 31 09:29:16 crc kubenswrapper[4783]: I0131 09:29:16.038417 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k2pw7"] Jan 31 09:29:17 crc kubenswrapper[4783]: I0131 09:29:17.653864 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0c0937-6461-4221-bbd8-3c7e37bbff9d" path="/var/lib/kubelet/pods/fd0c0937-6461-4221-bbd8-3c7e37bbff9d/volumes" Jan 31 09:29:19 crc kubenswrapper[4783]: I0131 09:29:19.028810 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6c24v"] Jan 31 09:29:19 crc kubenswrapper[4783]: I0131 09:29:19.037984 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6c24v"] Jan 31 09:29:19 crc kubenswrapper[4783]: I0131 09:29:19.658242 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6275a243-2cfc-4f77-a4b5-40a697e309d9" path="/var/lib/kubelet/pods/6275a243-2cfc-4f77-a4b5-40a697e309d9/volumes" Jan 31 09:29:23 crc kubenswrapper[4783]: I0131 09:29:23.890574 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:23 crc kubenswrapper[4783]: I0131 09:29:23.892816 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:23 crc kubenswrapper[4783]: I0131 09:29:23.901809 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.002477 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.002649 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.002794 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6s5\" (UniqueName: \"kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.105034 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content\") pod \"certified-operators-w45qt\" (UID: 
\"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.105155 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.105245 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6s5\" (UniqueName: \"kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.105550 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.105591 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities\") pod \"certified-operators-w45qt\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.127241 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6s5\" (UniqueName: \"kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5\") pod \"certified-operators-w45qt\" (UID: 
\"472ab97c-fe08-41c8-840e-7510a20235b2\") " pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.211706 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.631834 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:24 crc kubenswrapper[4783]: W0131 09:29:24.633975 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod472ab97c_fe08_41c8_840e_7510a20235b2.slice/crio-c5d356141eeb92f5e93b8edf9eea382be04b806ef735b99609d4875ca363683a WatchSource:0}: Error finding container c5d356141eeb92f5e93b8edf9eea382be04b806ef735b99609d4875ca363683a: Status 404 returned error can't find the container with id c5d356141eeb92f5e93b8edf9eea382be04b806ef735b99609d4875ca363683a Jan 31 09:29:24 crc kubenswrapper[4783]: I0131 09:29:24.700964 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerStarted","Data":"c5d356141eeb92f5e93b8edf9eea382be04b806ef735b99609d4875ca363683a"} Jan 31 09:29:25 crc kubenswrapper[4783]: I0131 09:29:25.712727 4783 generic.go:334] "Generic (PLEG): container finished" podID="472ab97c-fe08-41c8-840e-7510a20235b2" containerID="2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc" exitCode=0 Jan 31 09:29:25 crc kubenswrapper[4783]: I0131 09:29:25.712791 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerDied","Data":"2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc"} Jan 31 09:29:26 crc kubenswrapper[4783]: I0131 09:29:26.723876 4783 generic.go:334] "Generic 
(PLEG): container finished" podID="472ab97c-fe08-41c8-840e-7510a20235b2" containerID="7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9" exitCode=0 Jan 31 09:29:26 crc kubenswrapper[4783]: I0131 09:29:26.724032 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerDied","Data":"7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9"} Jan 31 09:29:27 crc kubenswrapper[4783]: I0131 09:29:27.736374 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerStarted","Data":"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00"} Jan 31 09:29:27 crc kubenswrapper[4783]: I0131 09:29:27.759685 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w45qt" podStartSLOduration=3.300394913 podStartE2EDuration="4.759667032s" podCreationTimestamp="2026-01-31 09:29:23 +0000 UTC" firstStartedPulling="2026-01-31 09:29:25.714679851 +0000 UTC m=+1476.383363319" lastFinishedPulling="2026-01-31 09:29:27.173951969 +0000 UTC m=+1477.842635438" observedRunningTime="2026-01-31 09:29:27.75619775 +0000 UTC m=+1478.424881218" watchObservedRunningTime="2026-01-31 09:29:27.759667032 +0000 UTC m=+1478.428350500" Jan 31 09:29:29 crc kubenswrapper[4783]: I0131 09:29:29.753323 4783 generic.go:334] "Generic (PLEG): container finished" podID="77285e40-3f9b-491d-a911-0f7b5c8058fb" containerID="13fb7076072b01489a52415450edac0ac52ef2f46aa8cb2e4c2248e4a5ff2fe9" exitCode=0 Jan 31 09:29:29 crc kubenswrapper[4783]: I0131 09:29:29.753425 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" 
event={"ID":"77285e40-3f9b-491d-a911-0f7b5c8058fb","Type":"ContainerDied","Data":"13fb7076072b01489a52415450edac0ac52ef2f46aa8cb2e4c2248e4a5ff2fe9"} Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.089106 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.272542 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory\") pod \"77285e40-3f9b-491d-a911-0f7b5c8058fb\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.272800 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam\") pod \"77285e40-3f9b-491d-a911-0f7b5c8058fb\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.272855 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx5xm\" (UniqueName: \"kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm\") pod \"77285e40-3f9b-491d-a911-0f7b5c8058fb\" (UID: \"77285e40-3f9b-491d-a911-0f7b5c8058fb\") " Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.282617 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm" (OuterVolumeSpecName: "kube-api-access-vx5xm") pod "77285e40-3f9b-491d-a911-0f7b5c8058fb" (UID: "77285e40-3f9b-491d-a911-0f7b5c8058fb"). InnerVolumeSpecName "kube-api-access-vx5xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.297459 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77285e40-3f9b-491d-a911-0f7b5c8058fb" (UID: "77285e40-3f9b-491d-a911-0f7b5c8058fb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.298568 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory" (OuterVolumeSpecName: "inventory") pod "77285e40-3f9b-491d-a911-0f7b5c8058fb" (UID: "77285e40-3f9b-491d-a911-0f7b5c8058fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.374896 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.374928 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77285e40-3f9b-491d-a911-0f7b5c8058fb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.374940 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx5xm\" (UniqueName: \"kubernetes.io/projected/77285e40-3f9b-491d-a911-0f7b5c8058fb-kube-api-access-vx5xm\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.771966 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" 
event={"ID":"77285e40-3f9b-491d-a911-0f7b5c8058fb","Type":"ContainerDied","Data":"3b071a528d22caa256cb537b757b2d893891d7b2a82d63097c1101bfa5c7cc79"} Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.772013 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b071a528d22caa256cb537b757b2d893891d7b2a82d63097c1101bfa5c7cc79" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.772071 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c9hls" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.836514 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj"] Jan 31 09:29:31 crc kubenswrapper[4783]: E0131 09:29:31.837009 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77285e40-3f9b-491d-a911-0f7b5c8058fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.837030 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="77285e40-3f9b-491d-a911-0f7b5c8058fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.837244 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="77285e40-3f9b-491d-a911-0f7b5c8058fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.837892 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.839518 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.839856 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.839896 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.840210 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.843982 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj"] Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.985771 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:31 crc kubenswrapper[4783]: I0131 09:29:31.986129 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdv66\" (UniqueName: \"kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:31 crc 
kubenswrapper[4783]: I0131 09:29:31.986216 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.088157 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.088254 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.088328 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdv66\" (UniqueName: \"kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.096418 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.096448 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.101633 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdv66\" (UniqueName: \"kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.158055 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.607652 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj"] Jan 31 09:29:32 crc kubenswrapper[4783]: I0131 09:29:32.781619 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" event={"ID":"49b1076f-b620-47cc-8cbf-c70ecdbeab06","Type":"ContainerStarted","Data":"a339f07529d0b089facc481871c98becf994a531eda6ef480711252d0abadb2d"} Jan 31 09:29:33 crc kubenswrapper[4783]: I0131 09:29:33.789029 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" event={"ID":"49b1076f-b620-47cc-8cbf-c70ecdbeab06","Type":"ContainerStarted","Data":"f96126eb22da3c85edded9b05a6678550676d815b6beedb732b86eff8c6cd2eb"} Jan 31 09:29:33 crc kubenswrapper[4783]: I0131 09:29:33.818104 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" podStartSLOduration=2.3513292359999998 podStartE2EDuration="2.818087823s" podCreationTimestamp="2026-01-31 09:29:31 +0000 UTC" firstStartedPulling="2026-01-31 09:29:32.612080255 +0000 UTC m=+1483.280763722" lastFinishedPulling="2026-01-31 09:29:33.078838841 +0000 UTC m=+1483.747522309" observedRunningTime="2026-01-31 09:29:33.804673207 +0000 UTC m=+1484.473356675" watchObservedRunningTime="2026-01-31 09:29:33.818087823 +0000 UTC m=+1484.486771291" Jan 31 09:29:34 crc kubenswrapper[4783]: I0131 09:29:34.212082 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:34 crc kubenswrapper[4783]: I0131 09:29:34.212134 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:34 crc kubenswrapper[4783]: I0131 09:29:34.253799 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:34 crc kubenswrapper[4783]: I0131 09:29:34.839576 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:34 crc kubenswrapper[4783]: I0131 09:29:34.887120 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:36 crc kubenswrapper[4783]: I0131 09:29:36.813708 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w45qt" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="registry-server" containerID="cri-o://69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00" gracePeriod=2 Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.192649 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.294223 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities\") pod \"472ab97c-fe08-41c8-840e-7510a20235b2\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.294326 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6s5\" (UniqueName: \"kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5\") pod \"472ab97c-fe08-41c8-840e-7510a20235b2\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.294368 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content\") pod \"472ab97c-fe08-41c8-840e-7510a20235b2\" (UID: \"472ab97c-fe08-41c8-840e-7510a20235b2\") " Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.294828 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities" (OuterVolumeSpecName: "utilities") pod "472ab97c-fe08-41c8-840e-7510a20235b2" (UID: "472ab97c-fe08-41c8-840e-7510a20235b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.295802 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.300593 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5" (OuterVolumeSpecName: "kube-api-access-pq6s5") pod "472ab97c-fe08-41c8-840e-7510a20235b2" (UID: "472ab97c-fe08-41c8-840e-7510a20235b2"). InnerVolumeSpecName "kube-api-access-pq6s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.339480 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "472ab97c-fe08-41c8-840e-7510a20235b2" (UID: "472ab97c-fe08-41c8-840e-7510a20235b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.396487 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq6s5\" (UniqueName: \"kubernetes.io/projected/472ab97c-fe08-41c8-840e-7510a20235b2-kube-api-access-pq6s5\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.396524 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472ab97c-fe08-41c8-840e-7510a20235b2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.825207 4783 generic.go:334] "Generic (PLEG): container finished" podID="472ab97c-fe08-41c8-840e-7510a20235b2" containerID="69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00" exitCode=0 Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.825268 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerDied","Data":"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00"} Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.825560 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w45qt" event={"ID":"472ab97c-fe08-41c8-840e-7510a20235b2","Type":"ContainerDied","Data":"c5d356141eeb92f5e93b8edf9eea382be04b806ef735b99609d4875ca363683a"} Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.825586 4783 scope.go:117] "RemoveContainer" containerID="69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.825348 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w45qt" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.845773 4783 scope.go:117] "RemoveContainer" containerID="7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.845917 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.852331 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w45qt"] Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.860674 4783 scope.go:117] "RemoveContainer" containerID="2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.893096 4783 scope.go:117] "RemoveContainer" containerID="69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00" Jan 31 09:29:37 crc kubenswrapper[4783]: E0131 09:29:37.894173 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00\": container with ID starting with 69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00 not found: ID does not exist" containerID="69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.894217 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00"} err="failed to get container status \"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00\": rpc error: code = NotFound desc = could not find container \"69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00\": container with ID starting with 69d1427fb1b20143d8f9253549d6cfc94b23baf72c63e45369cde3ab0bb38c00 not 
found: ID does not exist" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.894246 4783 scope.go:117] "RemoveContainer" containerID="7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9" Jan 31 09:29:37 crc kubenswrapper[4783]: E0131 09:29:37.894606 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9\": container with ID starting with 7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9 not found: ID does not exist" containerID="7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.894641 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9"} err="failed to get container status \"7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9\": rpc error: code = NotFound desc = could not find container \"7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9\": container with ID starting with 7ed75e91fecf81158a5db6650ed3d2dfd7a349c2ee6686ccc0e53e4cbd6403e9 not found: ID does not exist" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.894662 4783 scope.go:117] "RemoveContainer" containerID="2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc" Jan 31 09:29:37 crc kubenswrapper[4783]: E0131 09:29:37.894971 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc\": container with ID starting with 2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc not found: ID does not exist" containerID="2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc" Jan 31 09:29:37 crc kubenswrapper[4783]: I0131 09:29:37.894993 4783 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc"} err="failed to get container status \"2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc\": rpc error: code = NotFound desc = could not find container \"2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc\": container with ID starting with 2a52e81a1cb39a72cf40aca02aa633bce9bb4992e1d49dfa75b5e589782c77bc not found: ID does not exist" Jan 31 09:29:39 crc kubenswrapper[4783]: I0131 09:29:39.654598 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" path="/var/lib/kubelet/pods/472ab97c-fe08-41c8-840e-7510a20235b2/volumes" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.334958 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:29:50 crc kubenswrapper[4783]: E0131 09:29:50.341378 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="registry-server" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.341395 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="registry-server" Jan 31 09:29:50 crc kubenswrapper[4783]: E0131 09:29:50.341411 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="extract-content" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.341417 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="extract-content" Jan 31 09:29:50 crc kubenswrapper[4783]: E0131 09:29:50.341440 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="extract-utilities" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 
09:29:50.341446 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="extract-utilities" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.341615 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="472ab97c-fe08-41c8-840e-7510a20235b2" containerName="registry-server" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.342943 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.350255 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.453854 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.453957 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmq7\" (UniqueName: \"kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.453996 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 
09:29:50.555362 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.555446 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmq7\" (UniqueName: \"kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.555485 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.555796 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.555877 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.574906 4783 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jrmq7\" (UniqueName: \"kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7\") pod \"redhat-operators-gp896\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:50 crc kubenswrapper[4783]: I0131 09:29:50.657101 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:29:51 crc kubenswrapper[4783]: I0131 09:29:51.070368 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:29:51 crc kubenswrapper[4783]: W0131 09:29:51.080300 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6b3ef10_20aa_4274_9fbe_277f97c80640.slice/crio-be094fc382754e5f4765d3ba58cdef125f416011b3301272a5f3b0d14bb41d44 WatchSource:0}: Error finding container be094fc382754e5f4765d3ba58cdef125f416011b3301272a5f3b0d14bb41d44: Status 404 returned error can't find the container with id be094fc382754e5f4765d3ba58cdef125f416011b3301272a5f3b0d14bb41d44 Jan 31 09:29:51 crc kubenswrapper[4783]: I0131 09:29:51.934075 4783 generic.go:334] "Generic (PLEG): container finished" podID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerID="c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522" exitCode=0 Jan 31 09:29:51 crc kubenswrapper[4783]: I0131 09:29:51.934131 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerDied","Data":"c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522"} Jan 31 09:29:51 crc kubenswrapper[4783]: I0131 09:29:51.934181 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" 
event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerStarted","Data":"be094fc382754e5f4765d3ba58cdef125f416011b3301272a5f3b0d14bb41d44"} Jan 31 09:29:52 crc kubenswrapper[4783]: I0131 09:29:52.944993 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerStarted","Data":"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86"} Jan 31 09:29:54 crc kubenswrapper[4783]: I0131 09:29:54.967824 4783 generic.go:334] "Generic (PLEG): container finished" podID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerID="9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86" exitCode=0 Jan 31 09:29:54 crc kubenswrapper[4783]: I0131 09:29:54.967867 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerDied","Data":"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86"} Jan 31 09:29:55 crc kubenswrapper[4783]: I0131 09:29:55.982732 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerStarted","Data":"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe"} Jan 31 09:29:56 crc kubenswrapper[4783]: I0131 09:29:56.008410 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gp896" podStartSLOduration=2.434770543 podStartE2EDuration="6.008392355s" podCreationTimestamp="2026-01-31 09:29:50 +0000 UTC" firstStartedPulling="2026-01-31 09:29:51.935765982 +0000 UTC m=+1502.604449449" lastFinishedPulling="2026-01-31 09:29:55.509387793 +0000 UTC m=+1506.178071261" observedRunningTime="2026-01-31 09:29:56.000405243 +0000 UTC m=+1506.669088721" watchObservedRunningTime="2026-01-31 09:29:56.008392355 +0000 UTC m=+1506.677075824" 
Jan 31 09:29:57 crc kubenswrapper[4783]: I0131 09:29:57.051217 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bf84-account-create-update-nq754"] Jan 31 09:29:57 crc kubenswrapper[4783]: I0131 09:29:57.056962 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bf84-account-create-update-nq754"] Jan 31 09:29:57 crc kubenswrapper[4783]: I0131 09:29:57.664385 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1094fbb7-1b91-4925-85b4-c9dafceb46c9" path="/var/lib/kubelet/pods/1094fbb7-1b91-4925-85b4-c9dafceb46c9/volumes" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.034311 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6717-account-create-update-p4bsh"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.040134 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-krxr5"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.050370 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6717-account-create-update-p4bsh"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.055317 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-krxr5"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.061931 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f95f-account-create-update-275t8"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.067027 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2s7lh"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.071638 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pqf78"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.076507 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f95f-account-create-update-275t8"] Jan 31 09:29:58 crc 
kubenswrapper[4783]: I0131 09:29:58.081141 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2s7lh"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.088034 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pqf78"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.118762 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.120399 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.129577 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.225269 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.225759 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.225812 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6q49\" (UniqueName: \"kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49\") pod \"redhat-marketplace-n84cd\" (UID: 
\"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.327354 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.327424 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.327459 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6q49\" (UniqueName: \"kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.327870 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.327921 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " 
pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.345686 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6q49\" (UniqueName: \"kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49\") pod \"redhat-marketplace-n84cd\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.434575 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:29:58 crc kubenswrapper[4783]: W0131 09:29:58.899488 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb26c78d_8e92_4875_a3e4_d58dece2bb26.slice/crio-aad202e565b8b1caeadda02246e69d79468e9f81d6a1af74c8ff5889e053b454 WatchSource:0}: Error finding container aad202e565b8b1caeadda02246e69d79468e9f81d6a1af74c8ff5889e053b454: Status 404 returned error can't find the container with id aad202e565b8b1caeadda02246e69d79468e9f81d6a1af74c8ff5889e053b454 Jan 31 09:29:58 crc kubenswrapper[4783]: I0131 09:29:58.909517 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.011917 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerStarted","Data":"aad202e565b8b1caeadda02246e69d79468e9f81d6a1af74c8ff5889e053b454"} Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.657565 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149825bf-5cac-45b3-a51f-f569f43fa5d0" path="/var/lib/kubelet/pods/149825bf-5cac-45b3-a51f-f569f43fa5d0/volumes" Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.658750 4783 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ed13cd-8052-462d-bbb3-7d2863d38c2e" path="/var/lib/kubelet/pods/35ed13cd-8052-462d-bbb3-7d2863d38c2e/volumes" Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.659435 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46eac665-6761-4ff3-8718-6417ccea545d" path="/var/lib/kubelet/pods/46eac665-6761-4ff3-8718-6417ccea545d/volumes" Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.660067 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6263f4cc-b742-4057-96a4-d4a058ad3f44" path="/var/lib/kubelet/pods/6263f4cc-b742-4057-96a4-d4a058ad3f44/volumes" Jan 31 09:29:59 crc kubenswrapper[4783]: I0131 09:29:59.661300 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7764a47f-6ccf-43f1-a787-99db87fb5cfb" path="/var/lib/kubelet/pods/7764a47f-6ccf-43f1-a787-99db87fb5cfb/volumes" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.022845 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerID="192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42" exitCode=0 Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.022927 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerDied","Data":"192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42"} Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.064259 4783 scope.go:117] "RemoveContainer" containerID="d594b64cb69b29074ba9f06f5ac28af6cff153585d9b4b4443c8ec1eabe1344d" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.092803 4783 scope.go:117] "RemoveContainer" containerID="a0c7adbdcdc3c32262ec0c55d243958625d2a64ec655977717728d348bf879a4" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.140755 4783 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk"] Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.142173 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.144974 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.145213 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.155749 4783 scope.go:117] "RemoveContainer" containerID="a1a0890df4932c8a0a7bc5c1f7ec446cf7562b17edfd7b01d7fd0a026d2b11d6" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.157485 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk"] Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.190917 4783 scope.go:117] "RemoveContainer" containerID="4a25d7ae9d70db3cf6369f78fc4d5132e8bdc523ca1308ad2b307efc5fa7e3f6" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.259552 4783 scope.go:117] "RemoveContainer" containerID="634e4bb99cd4f4c0685080200c21099faa5c7885981ac72b80bb0ec74397d192" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.279857 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.279998 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjsn\" (UniqueName: \"kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.280071 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.285414 4783 scope.go:117] "RemoveContainer" containerID="03f3988b3e5c6a5472626eea137ca26b65b5e4f8cd826d1c8aa1c55b189b9f82" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.335069 4783 scope.go:117] "RemoveContainer" containerID="45efa1253dcd16e9db539aabbeb329901132f434749be449e60227f00738a351" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.352755 4783 scope.go:117] "RemoveContainer" containerID="4afa1e0517eca9bcff31824e306355561432ec3e2e411eb39b30ad844d68d49e" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.373517 4783 scope.go:117] "RemoveContainer" containerID="6834fb766d782505c282c414f27930a36a5e6731b75a0214876f329d8440066e" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.382699 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.382802 
4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjsn\" (UniqueName: \"kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.382855 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.383935 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.393881 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.395992 4783 scope.go:117] "RemoveContainer" containerID="1a4cb936a7541f68ef79f19c0624411d54eb926fab985b9167b8322f644300b8" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.397128 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjsn\" (UniqueName: 
\"kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn\") pod \"collect-profiles-29497530-wnwnk\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.555478 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.659891 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:00 crc kubenswrapper[4783]: I0131 09:30:00.660193 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:01 crc kubenswrapper[4783]: I0131 09:30:01.027874 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk"] Jan 31 09:30:01 crc kubenswrapper[4783]: W0131 09:30:01.028664 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6680368a_e3b2_43da_8a55_a781828c8a01.slice/crio-415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171 WatchSource:0}: Error finding container 415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171: Status 404 returned error can't find the container with id 415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171 Jan 31 09:30:01 crc kubenswrapper[4783]: I0131 09:30:01.031438 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerStarted","Data":"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e"} Jan 31 09:30:01 crc kubenswrapper[4783]: I0131 09:30:01.707728 4783 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-gp896" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="registry-server" probeResult="failure" output=< Jan 31 09:30:01 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:30:01 crc kubenswrapper[4783]: > Jan 31 09:30:02 crc kubenswrapper[4783]: I0131 09:30:02.040852 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerID="5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e" exitCode=0 Jan 31 09:30:02 crc kubenswrapper[4783]: I0131 09:30:02.040954 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerDied","Data":"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e"} Jan 31 09:30:02 crc kubenswrapper[4783]: I0131 09:30:02.042462 4783 generic.go:334] "Generic (PLEG): container finished" podID="6680368a-e3b2-43da-8a55-a781828c8a01" containerID="d9de0c713605f43657fefee0a4553f6d40c8ec269f3bf93b334f8ad279116eaf" exitCode=0 Jan 31 09:30:02 crc kubenswrapper[4783]: I0131 09:30:02.042499 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" event={"ID":"6680368a-e3b2-43da-8a55-a781828c8a01","Type":"ContainerDied","Data":"d9de0c713605f43657fefee0a4553f6d40c8ec269f3bf93b334f8ad279116eaf"} Jan 31 09:30:02 crc kubenswrapper[4783]: I0131 09:30:02.042533 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" event={"ID":"6680368a-e3b2-43da-8a55-a781828c8a01","Type":"ContainerStarted","Data":"415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171"} Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.055531 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerStarted","Data":"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7"} Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.078672 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n84cd" podStartSLOduration=2.572857829 podStartE2EDuration="5.07864624s" podCreationTimestamp="2026-01-31 09:29:58 +0000 UTC" firstStartedPulling="2026-01-31 09:30:00.025236814 +0000 UTC m=+1510.693920272" lastFinishedPulling="2026-01-31 09:30:02.531025215 +0000 UTC m=+1513.199708683" observedRunningTime="2026-01-31 09:30:03.071023966 +0000 UTC m=+1513.739707434" watchObservedRunningTime="2026-01-31 09:30:03.07864624 +0000 UTC m=+1513.747329708" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.337904 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.461015 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume\") pod \"6680368a-e3b2-43da-8a55-a781828c8a01\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.461841 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume\") pod \"6680368a-e3b2-43da-8a55-a781828c8a01\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.461890 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkjsn\" (UniqueName: 
\"kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn\") pod \"6680368a-e3b2-43da-8a55-a781828c8a01\" (UID: \"6680368a-e3b2-43da-8a55-a781828c8a01\") " Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.462628 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume" (OuterVolumeSpecName: "config-volume") pod "6680368a-e3b2-43da-8a55-a781828c8a01" (UID: "6680368a-e3b2-43da-8a55-a781828c8a01"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.469083 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6680368a-e3b2-43da-8a55-a781828c8a01" (UID: "6680368a-e3b2-43da-8a55-a781828c8a01"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.469142 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn" (OuterVolumeSpecName: "kube-api-access-qkjsn") pod "6680368a-e3b2-43da-8a55-a781828c8a01" (UID: "6680368a-e3b2-43da-8a55-a781828c8a01"). InnerVolumeSpecName "kube-api-access-qkjsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.565189 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkjsn\" (UniqueName: \"kubernetes.io/projected/6680368a-e3b2-43da-8a55-a781828c8a01-kube-api-access-qkjsn\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.565391 4783 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6680368a-e3b2-43da-8a55-a781828c8a01-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:03 crc kubenswrapper[4783]: I0131 09:30:03.565486 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6680368a-e3b2-43da-8a55-a781828c8a01-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:04 crc kubenswrapper[4783]: I0131 09:30:04.067484 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" event={"ID":"6680368a-e3b2-43da-8a55-a781828c8a01","Type":"ContainerDied","Data":"415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171"} Jan 31 09:30:04 crc kubenswrapper[4783]: I0131 09:30:04.067940 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415760b5b4ce9319a67cddf6baff6dd5d86327e412f08909abee76820b333171" Jan 31 09:30:04 crc kubenswrapper[4783]: I0131 09:30:04.067526 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-wnwnk" Jan 31 09:30:08 crc kubenswrapper[4783]: I0131 09:30:08.104502 4783 generic.go:334] "Generic (PLEG): container finished" podID="49b1076f-b620-47cc-8cbf-c70ecdbeab06" containerID="f96126eb22da3c85edded9b05a6678550676d815b6beedb732b86eff8c6cd2eb" exitCode=0 Jan 31 09:30:08 crc kubenswrapper[4783]: I0131 09:30:08.104591 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" event={"ID":"49b1076f-b620-47cc-8cbf-c70ecdbeab06","Type":"ContainerDied","Data":"f96126eb22da3c85edded9b05a6678550676d815b6beedb732b86eff8c6cd2eb"} Jan 31 09:30:08 crc kubenswrapper[4783]: I0131 09:30:08.434836 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:08 crc kubenswrapper[4783]: I0131 09:30:08.435295 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:08 crc kubenswrapper[4783]: I0131 09:30:08.534907 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.155551 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.198259 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.465364 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.593634 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam\") pod \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.593828 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory\") pod \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.593947 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdv66\" (UniqueName: \"kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66\") pod \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\" (UID: \"49b1076f-b620-47cc-8cbf-c70ecdbeab06\") " Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.600223 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66" (OuterVolumeSpecName: "kube-api-access-xdv66") pod "49b1076f-b620-47cc-8cbf-c70ecdbeab06" (UID: "49b1076f-b620-47cc-8cbf-c70ecdbeab06"). InnerVolumeSpecName "kube-api-access-xdv66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.618726 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "49b1076f-b620-47cc-8cbf-c70ecdbeab06" (UID: "49b1076f-b620-47cc-8cbf-c70ecdbeab06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.620109 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory" (OuterVolumeSpecName: "inventory") pod "49b1076f-b620-47cc-8cbf-c70ecdbeab06" (UID: "49b1076f-b620-47cc-8cbf-c70ecdbeab06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.698085 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.698136 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b1076f-b620-47cc-8cbf-c70ecdbeab06-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:09 crc kubenswrapper[4783]: I0131 09:30:09.698148 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdv66\" (UniqueName: \"kubernetes.io/projected/49b1076f-b620-47cc-8cbf-c70ecdbeab06-kube-api-access-xdv66\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.133306 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" 
event={"ID":"49b1076f-b620-47cc-8cbf-c70ecdbeab06","Type":"ContainerDied","Data":"a339f07529d0b089facc481871c98becf994a531eda6ef480711252d0abadb2d"} Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.133723 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a339f07529d0b089facc481871c98becf994a531eda6ef480711252d0abadb2d" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.133819 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.204134 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bmbrc"] Jan 31 09:30:10 crc kubenswrapper[4783]: E0131 09:30:10.204720 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6680368a-e3b2-43da-8a55-a781828c8a01" containerName="collect-profiles" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.204738 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6680368a-e3b2-43da-8a55-a781828c8a01" containerName="collect-profiles" Jan 31 09:30:10 crc kubenswrapper[4783]: E0131 09:30:10.204894 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b1076f-b620-47cc-8cbf-c70ecdbeab06" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.204906 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b1076f-b620-47cc-8cbf-c70ecdbeab06" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.205188 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b1076f-b620-47cc-8cbf-c70ecdbeab06" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.205213 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6680368a-e3b2-43da-8a55-a781828c8a01" containerName="collect-profiles" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.206132 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.208175 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.208960 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fpm\" (UniqueName: \"kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.209099 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.209189 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.209716 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.209778 4783 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.209956 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.213261 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bmbrc"] Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.311339 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.311383 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.311537 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fpm\" (UniqueName: \"kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.317292 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.317411 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.327521 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fpm\" (UniqueName: \"kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm\") pod \"ssh-known-hosts-edpm-deployment-bmbrc\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.523179 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.702877 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.744358 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:10 crc kubenswrapper[4783]: I0131 09:30:10.980829 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-bmbrc"] Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.141927 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" event={"ID":"e9ce688e-1f0e-486c-b3c7-4b45243713ed","Type":"ContainerStarted","Data":"cc9fcea4ce3e816b8a9b737c58bf51d4bdfcbd93d3e7c77f456b76b018f4c181"} Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.142291 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n84cd" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="registry-server" containerID="cri-o://687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7" gracePeriod=2 Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.183003 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.558354 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.748074 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities\") pod \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.748212 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content\") pod \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.748499 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6q49\" (UniqueName: \"kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49\") pod \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\" (UID: \"fb26c78d-8e92-4875-a3e4-d58dece2bb26\") " Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.748881 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities" (OuterVolumeSpecName: "utilities") pod "fb26c78d-8e92-4875-a3e4-d58dece2bb26" (UID: "fb26c78d-8e92-4875-a3e4-d58dece2bb26"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.749328 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.751712 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49" (OuterVolumeSpecName: "kube-api-access-q6q49") pod "fb26c78d-8e92-4875-a3e4-d58dece2bb26" (UID: "fb26c78d-8e92-4875-a3e4-d58dece2bb26"). InnerVolumeSpecName "kube-api-access-q6q49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.764750 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb26c78d-8e92-4875-a3e4-d58dece2bb26" (UID: "fb26c78d-8e92-4875-a3e4-d58dece2bb26"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.851083 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb26c78d-8e92-4875-a3e4-d58dece2bb26-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:11 crc kubenswrapper[4783]: I0131 09:30:11.851189 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6q49\" (UniqueName: \"kubernetes.io/projected/fb26c78d-8e92-4875-a3e4-d58dece2bb26-kube-api-access-q6q49\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.150360 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" event={"ID":"e9ce688e-1f0e-486c-b3c7-4b45243713ed","Type":"ContainerStarted","Data":"43c459d3e1371a1fd9896dbdb5507584756be5a5ab424c196bcdffa955ed4860"} Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152535 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerID="687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7" exitCode=0 Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152604 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerDied","Data":"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7"} Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152653 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n84cd" event={"ID":"fb26c78d-8e92-4875-a3e4-d58dece2bb26","Type":"ContainerDied","Data":"aad202e565b8b1caeadda02246e69d79468e9f81d6a1af74c8ff5889e053b454"} Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152674 4783 scope.go:117] "RemoveContainer" 
containerID="687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152617 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n84cd" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.152750 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gp896" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="registry-server" containerID="cri-o://e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe" gracePeriod=2 Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.165303 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" podStartSLOduration=1.426475373 podStartE2EDuration="2.165291612s" podCreationTimestamp="2026-01-31 09:30:10 +0000 UTC" firstStartedPulling="2026-01-31 09:30:10.985596017 +0000 UTC m=+1521.654279485" lastFinishedPulling="2026-01-31 09:30:11.724412255 +0000 UTC m=+1522.393095724" observedRunningTime="2026-01-31 09:30:12.162072171 +0000 UTC m=+1522.830755640" watchObservedRunningTime="2026-01-31 09:30:12.165291612 +0000 UTC m=+1522.833975081" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.176506 4783 scope.go:117] "RemoveContainer" containerID="5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.184212 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.188185 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n84cd"] Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.201107 4783 scope.go:117] "RemoveContainer" containerID="192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42" Jan 31 09:30:12 
crc kubenswrapper[4783]: I0131 09:30:12.273323 4783 scope.go:117] "RemoveContainer" containerID="687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7" Jan 31 09:30:12 crc kubenswrapper[4783]: E0131 09:30:12.273755 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7\": container with ID starting with 687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7 not found: ID does not exist" containerID="687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.273804 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7"} err="failed to get container status \"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7\": rpc error: code = NotFound desc = could not find container \"687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7\": container with ID starting with 687aa0bc06c79d8b76eff20e0424a9218709bf19dcb547fb6ccb9bdfb83347b7 not found: ID does not exist" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.273844 4783 scope.go:117] "RemoveContainer" containerID="5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e" Jan 31 09:30:12 crc kubenswrapper[4783]: E0131 09:30:12.274273 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e\": container with ID starting with 5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e not found: ID does not exist" containerID="5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.274320 4783 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e"} err="failed to get container status \"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e\": rpc error: code = NotFound desc = could not find container \"5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e\": container with ID starting with 5392721d6f51f9796b195b5c629c93d3c8ee21434a9fdb31992fb83811c3928e not found: ID does not exist" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.274356 4783 scope.go:117] "RemoveContainer" containerID="192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42" Jan 31 09:30:12 crc kubenswrapper[4783]: E0131 09:30:12.274727 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42\": container with ID starting with 192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42 not found: ID does not exist" containerID="192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.274760 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42"} err="failed to get container status \"192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42\": rpc error: code = NotFound desc = could not find container \"192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42\": container with ID starting with 192546d7187326409be2344f67c5dd519625390d6999def1a1b6ba821fb9ec42 not found: ID does not exist" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.559915 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.570688 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content\") pod \"e6b3ef10-20aa-4274-9fbe-277f97c80640\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.570746 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmq7\" (UniqueName: \"kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7\") pod \"e6b3ef10-20aa-4274-9fbe-277f97c80640\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.570854 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities\") pod \"e6b3ef10-20aa-4274-9fbe-277f97c80640\" (UID: \"e6b3ef10-20aa-4274-9fbe-277f97c80640\") " Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.571774 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities" (OuterVolumeSpecName: "utilities") pod "e6b3ef10-20aa-4274-9fbe-277f97c80640" (UID: "e6b3ef10-20aa-4274-9fbe-277f97c80640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.576450 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7" (OuterVolumeSpecName: "kube-api-access-jrmq7") pod "e6b3ef10-20aa-4274-9fbe-277f97c80640" (UID: "e6b3ef10-20aa-4274-9fbe-277f97c80640"). InnerVolumeSpecName "kube-api-access-jrmq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.661271 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6b3ef10-20aa-4274-9fbe-277f97c80640" (UID: "e6b3ef10-20aa-4274-9fbe-277f97c80640"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.677003 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.677309 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6b3ef10-20aa-4274-9fbe-277f97c80640-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:12 crc kubenswrapper[4783]: I0131 09:30:12.677327 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrmq7\" (UniqueName: \"kubernetes.io/projected/e6b3ef10-20aa-4274-9fbe-277f97c80640-kube-api-access-jrmq7\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.175107 4783 generic.go:334] "Generic (PLEG): container finished" podID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerID="e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe" exitCode=0 Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.177707 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerDied","Data":"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe"} Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.177743 4783 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp896" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.177762 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp896" event={"ID":"e6b3ef10-20aa-4274-9fbe-277f97c80640","Type":"ContainerDied","Data":"be094fc382754e5f4765d3ba58cdef125f416011b3301272a5f3b0d14bb41d44"} Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.177797 4783 scope.go:117] "RemoveContainer" containerID="e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.217374 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.218229 4783 scope.go:117] "RemoveContainer" containerID="9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.222709 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gp896"] Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.234079 4783 scope.go:117] "RemoveContainer" containerID="c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.267924 4783 scope.go:117] "RemoveContainer" containerID="e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe" Jan 31 09:30:13 crc kubenswrapper[4783]: E0131 09:30:13.268291 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe\": container with ID starting with e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe not found: ID does not exist" containerID="e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.268345 4783 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe"} err="failed to get container status \"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe\": rpc error: code = NotFound desc = could not find container \"e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe\": container with ID starting with e5865c560300a237874970d6eb801203c409bf6a0f50eeb0c249f0146de3bafe not found: ID does not exist" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.268378 4783 scope.go:117] "RemoveContainer" containerID="9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86" Jan 31 09:30:13 crc kubenswrapper[4783]: E0131 09:30:13.268732 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86\": container with ID starting with 9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86 not found: ID does not exist" containerID="9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.268765 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86"} err="failed to get container status \"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86\": rpc error: code = NotFound desc = could not find container \"9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86\": container with ID starting with 9e4e0902accd24a3a914373500c814251f07ea2e8d8ad5ef62f99af356a74e86 not found: ID does not exist" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.268790 4783 scope.go:117] "RemoveContainer" containerID="c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522" Jan 31 09:30:13 crc kubenswrapper[4783]: E0131 
09:30:13.269144 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522\": container with ID starting with c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522 not found: ID does not exist" containerID="c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.269199 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522"} err="failed to get container status \"c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522\": rpc error: code = NotFound desc = could not find container \"c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522\": container with ID starting with c4d36be0cdb903087caa213a84f92af5315fc9c1d22d7522d6f0191ad62bd522 not found: ID does not exist" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.653495 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" path="/var/lib/kubelet/pods/e6b3ef10-20aa-4274-9fbe-277f97c80640/volumes" Jan 31 09:30:13 crc kubenswrapper[4783]: I0131 09:30:13.654389 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" path="/var/lib/kubelet/pods/fb26c78d-8e92-4875-a3e4-d58dece2bb26/volumes" Jan 31 09:30:17 crc kubenswrapper[4783]: I0131 09:30:17.211143 4783 generic.go:334] "Generic (PLEG): container finished" podID="e9ce688e-1f0e-486c-b3c7-4b45243713ed" containerID="43c459d3e1371a1fd9896dbdb5507584756be5a5ab424c196bcdffa955ed4860" exitCode=0 Jan 31 09:30:17 crc kubenswrapper[4783]: I0131 09:30:17.211207 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" 
event={"ID":"e9ce688e-1f0e-486c-b3c7-4b45243713ed","Type":"ContainerDied","Data":"43c459d3e1371a1fd9896dbdb5507584756be5a5ab424c196bcdffa955ed4860"} Jan 31 09:30:17 crc kubenswrapper[4783]: I0131 09:30:17.756205 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:30:17 crc kubenswrapper[4783]: I0131 09:30:17.756276 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.539647 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.707528 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4fpm\" (UniqueName: \"kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm\") pod \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.708564 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam\") pod \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.708707 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0\") pod \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\" (UID: \"e9ce688e-1f0e-486c-b3c7-4b45243713ed\") " Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.713600 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm" (OuterVolumeSpecName: "kube-api-access-j4fpm") pod "e9ce688e-1f0e-486c-b3c7-4b45243713ed" (UID: "e9ce688e-1f0e-486c-b3c7-4b45243713ed"). InnerVolumeSpecName "kube-api-access-j4fpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.732446 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e9ce688e-1f0e-486c-b3c7-4b45243713ed" (UID: "e9ce688e-1f0e-486c-b3c7-4b45243713ed"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.733850 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9ce688e-1f0e-486c-b3c7-4b45243713ed" (UID: "e9ce688e-1f0e-486c-b3c7-4b45243713ed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.811669 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4fpm\" (UniqueName: \"kubernetes.io/projected/e9ce688e-1f0e-486c-b3c7-4b45243713ed-kube-api-access-j4fpm\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.811697 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:18 crc kubenswrapper[4783]: I0131 09:30:18.811707 4783 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9ce688e-1f0e-486c-b3c7-4b45243713ed-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.041332 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwssw"] Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.049266 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwssw"] Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.234282 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" 
event={"ID":"e9ce688e-1f0e-486c-b3c7-4b45243713ed","Type":"ContainerDied","Data":"cc9fcea4ce3e816b8a9b737c58bf51d4bdfcbd93d3e7c77f456b76b018f4c181"} Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.234341 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc9fcea4ce3e816b8a9b737c58bf51d4bdfcbd93d3e7c77f456b76b018f4c181" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.234462 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-bmbrc" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296494 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2"] Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296854 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296872 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296888 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296897 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296908 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="extract-utilities" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296914 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="extract-utilities" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 
09:30:19.296931 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="extract-utilities" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296937 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="extract-utilities" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296953 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="extract-content" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296959 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="extract-content" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296973 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="extract-content" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296978 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="extract-content" Jan 31 09:30:19 crc kubenswrapper[4783]: E0131 09:30:19.296986 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ce688e-1f0e-486c-b3c7-4b45243713ed" containerName="ssh-known-hosts-edpm-deployment" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.296992 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ce688e-1f0e-486c-b3c7-4b45243713ed" containerName="ssh-known-hosts-edpm-deployment" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.297152 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb26c78d-8e92-4875-a3e4-d58dece2bb26" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.297178 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ce688e-1f0e-486c-b3c7-4b45243713ed" containerName="ssh-known-hosts-edpm-deployment" 
Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.297192 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b3ef10-20aa-4274-9fbe-277f97c80640" containerName="registry-server" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.297761 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.302405 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.302417 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.302537 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.302725 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.306314 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2"] Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.319449 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgk6\" (UniqueName: \"kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.319525 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.319649 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.421884 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgk6\" (UniqueName: \"kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.421971 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.422039 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.426084 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.426552 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.439074 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgk6\" (UniqueName: \"kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mpq2\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.618938 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:19 crc kubenswrapper[4783]: I0131 09:30:19.689075 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d39e3f-c526-4ce8-8ca2-f7fa369be6ef" path="/var/lib/kubelet/pods/32d39e3f-c526-4ce8-8ca2-f7fa369be6ef/volumes" Jan 31 09:30:20 crc kubenswrapper[4783]: I0131 09:30:20.103863 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2"] Jan 31 09:30:20 crc kubenswrapper[4783]: I0131 09:30:20.244027 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" event={"ID":"35e3815e-af8f-4724-846b-ea6038002f70","Type":"ContainerStarted","Data":"e52f2a1389575ca3fc2ddc487b3576a3ec1a348272300f65358935f0e30c5599"} Jan 31 09:30:21 crc kubenswrapper[4783]: I0131 09:30:21.257006 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" event={"ID":"35e3815e-af8f-4724-846b-ea6038002f70","Type":"ContainerStarted","Data":"d69cbf7c771fa6db7293607d5dc2f0bc772b3368facb80edf450ec751371a929"} Jan 31 09:30:21 crc kubenswrapper[4783]: I0131 09:30:21.275289 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" podStartSLOduration=1.791886533 podStartE2EDuration="2.275263293s" podCreationTimestamp="2026-01-31 09:30:19 +0000 UTC" firstStartedPulling="2026-01-31 09:30:20.111609691 +0000 UTC m=+1530.780293159" lastFinishedPulling="2026-01-31 09:30:20.594986451 +0000 UTC m=+1531.263669919" observedRunningTime="2026-01-31 09:30:21.268539804 +0000 UTC m=+1531.937223272" watchObservedRunningTime="2026-01-31 09:30:21.275263293 +0000 UTC m=+1531.943946761" Jan 31 09:30:27 crc kubenswrapper[4783]: I0131 09:30:27.303366 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="35e3815e-af8f-4724-846b-ea6038002f70" containerID="d69cbf7c771fa6db7293607d5dc2f0bc772b3368facb80edf450ec751371a929" exitCode=0 Jan 31 09:30:27 crc kubenswrapper[4783]: I0131 09:30:27.303447 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" event={"ID":"35e3815e-af8f-4724-846b-ea6038002f70","Type":"ContainerDied","Data":"d69cbf7c771fa6db7293607d5dc2f0bc772b3368facb80edf450ec751371a929"} Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.690011 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.822887 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam\") pod \"35e3815e-af8f-4724-846b-ea6038002f70\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.822937 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgk6\" (UniqueName: \"kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6\") pod \"35e3815e-af8f-4724-846b-ea6038002f70\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.823106 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory\") pod \"35e3815e-af8f-4724-846b-ea6038002f70\" (UID: \"35e3815e-af8f-4724-846b-ea6038002f70\") " Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.828697 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6" 
(OuterVolumeSpecName: "kube-api-access-clgk6") pod "35e3815e-af8f-4724-846b-ea6038002f70" (UID: "35e3815e-af8f-4724-846b-ea6038002f70"). InnerVolumeSpecName "kube-api-access-clgk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.849231 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35e3815e-af8f-4724-846b-ea6038002f70" (UID: "35e3815e-af8f-4724-846b-ea6038002f70"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.851383 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory" (OuterVolumeSpecName: "inventory") pod "35e3815e-af8f-4724-846b-ea6038002f70" (UID: "35e3815e-af8f-4724-846b-ea6038002f70"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.931876 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.931945 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35e3815e-af8f-4724-846b-ea6038002f70-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:28 crc kubenswrapper[4783]: I0131 09:30:28.931970 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgk6\" (UniqueName: \"kubernetes.io/projected/35e3815e-af8f-4724-846b-ea6038002f70-kube-api-access-clgk6\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.322830 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" event={"ID":"35e3815e-af8f-4724-846b-ea6038002f70","Type":"ContainerDied","Data":"e52f2a1389575ca3fc2ddc487b3576a3ec1a348272300f65358935f0e30c5599"} Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.323188 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52f2a1389575ca3fc2ddc487b3576a3ec1a348272300f65358935f0e30c5599" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.322895 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mpq2" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.405693 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb"] Jan 31 09:30:29 crc kubenswrapper[4783]: E0131 09:30:29.406119 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e3815e-af8f-4724-846b-ea6038002f70" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.406133 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e3815e-af8f-4724-846b-ea6038002f70" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.406317 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e3815e-af8f-4724-846b-ea6038002f70" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.406921 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.409327 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.409557 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.409684 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.409777 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.412744 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb"] Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.442537 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.442597 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.443101 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8n4r\" (UniqueName: \"kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.545067 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8n4r\" (UniqueName: \"kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.545178 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.545215 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.550659 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.551305 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.566879 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8n4r\" (UniqueName: \"kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:29 crc kubenswrapper[4783]: I0131 09:30:29.724491 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:30 crc kubenswrapper[4783]: I0131 09:30:30.270777 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb"] Jan 31 09:30:30 crc kubenswrapper[4783]: I0131 09:30:30.331227 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" event={"ID":"cf6bea6a-6877-4624-b8ac-cfd51fb514a9","Type":"ContainerStarted","Data":"7894678b17fdacb96ddf5242f89a15f3f2db81426d33ff44d3abf8b7713d657c"} Jan 31 09:30:31 crc kubenswrapper[4783]: I0131 09:30:31.342689 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" event={"ID":"cf6bea6a-6877-4624-b8ac-cfd51fb514a9","Type":"ContainerStarted","Data":"a67a9819559ac5d0dbbb7439da807b82f634c81892e83cc2a5a57649b43d8e6f"} Jan 31 09:30:31 crc kubenswrapper[4783]: I0131 09:30:31.361301 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" podStartSLOduration=1.802885135 podStartE2EDuration="2.361284416s" podCreationTimestamp="2026-01-31 09:30:29 +0000 UTC" firstStartedPulling="2026-01-31 09:30:30.273898682 +0000 UTC m=+1540.942582150" lastFinishedPulling="2026-01-31 09:30:30.832297963 +0000 UTC m=+1541.500981431" observedRunningTime="2026-01-31 09:30:31.355119099 +0000 UTC m=+1542.023802567" watchObservedRunningTime="2026-01-31 09:30:31.361284416 +0000 UTC m=+1542.029967884" Jan 31 09:30:38 crc kubenswrapper[4783]: I0131 09:30:38.039061 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h95z6"] Jan 31 09:30:38 crc kubenswrapper[4783]: I0131 09:30:38.044579 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h95z6"] Jan 31 09:30:38 crc kubenswrapper[4783]: I0131 
09:30:38.409190 4783 generic.go:334] "Generic (PLEG): container finished" podID="cf6bea6a-6877-4624-b8ac-cfd51fb514a9" containerID="a67a9819559ac5d0dbbb7439da807b82f634c81892e83cc2a5a57649b43d8e6f" exitCode=0 Jan 31 09:30:38 crc kubenswrapper[4783]: I0131 09:30:38.409237 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" event={"ID":"cf6bea6a-6877-4624-b8ac-cfd51fb514a9","Type":"ContainerDied","Data":"a67a9819559ac5d0dbbb7439da807b82f634c81892e83cc2a5a57649b43d8e6f"} Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.032348 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-sc4zh"] Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.041190 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-sc4zh"] Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.655513 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081ac503-fb08-4990-9019-be9c35167de3" path="/var/lib/kubelet/pods/081ac503-fb08-4990-9019-be9c35167de3/volumes" Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.656692 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbba53b-4460-44f1-9745-1c627089c168" path="/var/lib/kubelet/pods/7fbba53b-4460-44f1-9745-1c627089c168/volumes" Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.807882 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.972021 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8n4r\" (UniqueName: \"kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r\") pod \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.972140 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory\") pod \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.972398 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam\") pod \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\" (UID: \"cf6bea6a-6877-4624-b8ac-cfd51fb514a9\") " Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.978339 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r" (OuterVolumeSpecName: "kube-api-access-q8n4r") pod "cf6bea6a-6877-4624-b8ac-cfd51fb514a9" (UID: "cf6bea6a-6877-4624-b8ac-cfd51fb514a9"). InnerVolumeSpecName "kube-api-access-q8n4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.995289 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf6bea6a-6877-4624-b8ac-cfd51fb514a9" (UID: "cf6bea6a-6877-4624-b8ac-cfd51fb514a9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:39 crc kubenswrapper[4783]: I0131 09:30:39.996936 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory" (OuterVolumeSpecName: "inventory") pod "cf6bea6a-6877-4624-b8ac-cfd51fb514a9" (UID: "cf6bea6a-6877-4624-b8ac-cfd51fb514a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.074675 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8n4r\" (UniqueName: \"kubernetes.io/projected/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-kube-api-access-q8n4r\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.074702 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.074712 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf6bea6a-6877-4624-b8ac-cfd51fb514a9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.424973 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" 
event={"ID":"cf6bea6a-6877-4624-b8ac-cfd51fb514a9","Type":"ContainerDied","Data":"7894678b17fdacb96ddf5242f89a15f3f2db81426d33ff44d3abf8b7713d657c"} Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.425316 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7894678b17fdacb96ddf5242f89a15f3f2db81426d33ff44d3abf8b7713d657c" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.425031 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.506384 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk"] Jan 31 09:30:40 crc kubenswrapper[4783]: E0131 09:30:40.506830 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6bea6a-6877-4624-b8ac-cfd51fb514a9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.506850 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6bea6a-6877-4624-b8ac-cfd51fb514a9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.507010 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6bea6a-6877-4624-b8ac-cfd51fb514a9" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.507792 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.512917 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.513000 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514103 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514303 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514420 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514641 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514802 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.514932 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.521777 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk"] Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585435 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585479 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585503 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cj77\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585528 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585759 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585811 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585892 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.585930 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586028 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586098 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586142 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586340 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586385 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.586477 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.688303 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.688354 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.688393 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.688434 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689117 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689180 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cj77\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689217 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689275 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689307 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689344 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689383 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689424 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689465 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.689492 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.699804 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.701590 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.701618 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.703200 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.704371 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.704393 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.706948 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.706958 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.706993 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.707014 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.707217 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.707890 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.708695 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.709288 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cj77\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:40 crc kubenswrapper[4783]: I0131 09:30:40.822795 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:30:41 crc kubenswrapper[4783]: I0131 09:30:41.322228 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk"] Jan 31 09:30:41 crc kubenswrapper[4783]: I0131 09:30:41.434577 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" event={"ID":"b59b9070-af55-4915-bae2-414ca2aab1b7","Type":"ContainerStarted","Data":"9336d7ebe0e65a08f52d405c3f3f43e2ee7206c4565ac8f72d798ef5eb8dc887"} Jan 31 09:30:42 crc kubenswrapper[4783]: I0131 09:30:42.448263 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" event={"ID":"b59b9070-af55-4915-bae2-414ca2aab1b7","Type":"ContainerStarted","Data":"1656f8f3c9bb74a14e0dbdcf7ed2aa6df977498adbed69bbe1297ec3d65f2c28"} Jan 31 09:30:42 crc kubenswrapper[4783]: I0131 09:30:42.468639 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" podStartSLOduration=1.9663987349999998 podStartE2EDuration="2.468611997s" podCreationTimestamp="2026-01-31 09:30:40 +0000 UTC" firstStartedPulling="2026-01-31 09:30:41.322768813 +0000 UTC m=+1551.991452281" lastFinishedPulling="2026-01-31 09:30:41.824982075 +0000 UTC m=+1552.493665543" observedRunningTime="2026-01-31 09:30:42.46554354 +0000 UTC m=+1553.134227008" watchObservedRunningTime="2026-01-31 09:30:42.468611997 +0000 UTC m=+1553.137295466" Jan 31 09:30:47 crc kubenswrapper[4783]: I0131 09:30:47.756239 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:30:47 crc kubenswrapper[4783]: I0131 09:30:47.756938 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:31:00 crc kubenswrapper[4783]: I0131 09:31:00.565841 4783 scope.go:117] "RemoveContainer" containerID="fbd9072a8d3842a95774cac15afdcedf7d6510f10e98405f7df9cb200014801e" Jan 31 09:31:00 crc kubenswrapper[4783]: I0131 09:31:00.602904 4783 scope.go:117] "RemoveContainer" containerID="fb61b6f9b06124081a24706bc08481bfbf889a8936a74032b6cafee05c43aa65" Jan 31 09:31:00 crc kubenswrapper[4783]: I0131 09:31:00.645344 4783 scope.go:117] "RemoveContainer" containerID="c3d917f33cf7e46bcc19a946b53c372a9598023906e4501825aed919076c9be7" Jan 31 09:31:08 crc kubenswrapper[4783]: I0131 09:31:08.686130 4783 generic.go:334] "Generic (PLEG): container finished" podID="b59b9070-af55-4915-bae2-414ca2aab1b7" containerID="1656f8f3c9bb74a14e0dbdcf7ed2aa6df977498adbed69bbe1297ec3d65f2c28" exitCode=0 Jan 31 09:31:08 crc kubenswrapper[4783]: I0131 09:31:08.686214 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" event={"ID":"b59b9070-af55-4915-bae2-414ca2aab1b7","Type":"ContainerDied","Data":"1656f8f3c9bb74a14e0dbdcf7ed2aa6df977498adbed69bbe1297ec3d65f2c28"} Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.060221 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.098130 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.105242 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.199718 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.199816 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.199842 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.199910 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cj77\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200050 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200121 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200212 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200264 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200289 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200319 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200349 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200439 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle\") pod \"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.200475 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle\") pod 
\"b59b9070-af55-4915-bae2-414ca2aab1b7\" (UID: \"b59b9070-af55-4915-bae2-414ca2aab1b7\") " Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.201275 4783 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.204927 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.206583 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.207127 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.207476 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.207540 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77" (OuterVolumeSpecName: "kube-api-access-5cj77") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "kube-api-access-5cj77". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.207640 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.207746 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.208346 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.209292 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.211793 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.211954 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.227184 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.228310 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory" (OuterVolumeSpecName: "inventory") pod "b59b9070-af55-4915-bae2-414ca2aab1b7" (UID: "b59b9070-af55-4915-bae2-414ca2aab1b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303321 4783 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303356 4783 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303371 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc 
kubenswrapper[4783]: I0131 09:31:10.303385 4783 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303398 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303410 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303424 4783 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303434 4783 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303444 4783 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303454 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303466 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303476 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b59b9070-af55-4915-bae2-414ca2aab1b7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.303486 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cj77\" (UniqueName: \"kubernetes.io/projected/b59b9070-af55-4915-bae2-414ca2aab1b7-kube-api-access-5cj77\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.703777 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" event={"ID":"b59b9070-af55-4915-bae2-414ca2aab1b7","Type":"ContainerDied","Data":"9336d7ebe0e65a08f52d405c3f3f43e2ee7206c4565ac8f72d798ef5eb8dc887"} Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.703828 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9336d7ebe0e65a08f52d405c3f3f43e2ee7206c4565ac8f72d798ef5eb8dc887" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.703831 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.852744 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb"] Jan 31 09:31:10 crc kubenswrapper[4783]: E0131 09:31:10.853233 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b59b9070-af55-4915-bae2-414ca2aab1b7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.853257 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59b9070-af55-4915-bae2-414ca2aab1b7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.853446 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="b59b9070-af55-4915-bae2-414ca2aab1b7" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.854157 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.856916 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.857017 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.857068 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.856939 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.857014 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.862754 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb"] Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.916618 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.917087 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: 
\"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.917349 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.917392 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:10 crc kubenswrapper[4783]: I0131 09:31:10.917423 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7srr\" (UniqueName: \"kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.019703 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.019770 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.019858 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.019876 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.019900 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7srr\" (UniqueName: \"kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.020778 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.023555 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.024092 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.024950 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.034320 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7srr\" (UniqueName: \"kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zp2tb\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.169717 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.628415 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb"] Jan 31 09:31:11 crc kubenswrapper[4783]: I0131 09:31:11.713858 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" event={"ID":"cf5af9ea-73f9-4316-8fdc-abe4ede8632a","Type":"ContainerStarted","Data":"51dd149386ef2e4d61ac3354b7218ffde8af12096117fe238d7d77f0675c2afd"} Jan 31 09:31:12 crc kubenswrapper[4783]: I0131 09:31:12.723566 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" event={"ID":"cf5af9ea-73f9-4316-8fdc-abe4ede8632a","Type":"ContainerStarted","Data":"9173cbb38addbfa704b66a4d399d570897cca50743a1328c9733d1a39611992d"} Jan 31 09:31:12 crc kubenswrapper[4783]: I0131 09:31:12.742603 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" podStartSLOduration=2.158168505 podStartE2EDuration="2.742575071s" podCreationTimestamp="2026-01-31 09:31:10 +0000 UTC" firstStartedPulling="2026-01-31 09:31:11.632414337 +0000 UTC m=+1582.301097806" lastFinishedPulling="2026-01-31 09:31:12.216820904 +0000 UTC m=+1582.885504372" observedRunningTime="2026-01-31 09:31:12.737873695 +0000 UTC m=+1583.406557163" watchObservedRunningTime="2026-01-31 09:31:12.742575071 +0000 UTC m=+1583.411258539" Jan 31 09:31:17 crc kubenswrapper[4783]: I0131 09:31:17.756674 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:31:17 crc kubenswrapper[4783]: I0131 09:31:17.757458 4783 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:31:17 crc kubenswrapper[4783]: I0131 09:31:17.757538 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:31:17 crc kubenswrapper[4783]: I0131 09:31:17.758774 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:31:17 crc kubenswrapper[4783]: I0131 09:31:17.758854 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" gracePeriod=600 Jan 31 09:31:17 crc kubenswrapper[4783]: E0131 09:31:17.878944 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:31:18 crc kubenswrapper[4783]: I0131 09:31:18.774268 4783 generic.go:334] "Generic (PLEG): container finished" 
podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" exitCode=0 Jan 31 09:31:18 crc kubenswrapper[4783]: I0131 09:31:18.774330 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b"} Jan 31 09:31:18 crc kubenswrapper[4783]: I0131 09:31:18.774755 4783 scope.go:117] "RemoveContainer" containerID="88bb33d5640838b31dc21da454049d6e8053db2a99a4f0698c705ac33568abca" Jan 31 09:31:18 crc kubenswrapper[4783]: I0131 09:31:18.775276 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:31:18 crc kubenswrapper[4783]: E0131 09:31:18.775526 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:31:24 crc kubenswrapper[4783]: I0131 09:31:24.036401 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-h5qh5"] Jan 31 09:31:24 crc kubenswrapper[4783]: I0131 09:31:24.043235 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-h5qh5"] Jan 31 09:31:25 crc kubenswrapper[4783]: I0131 09:31:25.655010 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813d6a18-c983-4731-ae74-ca671d822949" path="/var/lib/kubelet/pods/813d6a18-c983-4731-ae74-ca671d822949/volumes" Jan 31 09:31:33 crc kubenswrapper[4783]: I0131 09:31:33.646621 4783 scope.go:117] 
"RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:31:33 crc kubenswrapper[4783]: E0131 09:31:33.647505 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:31:44 crc kubenswrapper[4783]: I0131 09:31:44.646097 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:31:44 crc kubenswrapper[4783]: E0131 09:31:44.646966 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:31:55 crc kubenswrapper[4783]: I0131 09:31:55.646073 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:31:55 crc kubenswrapper[4783]: E0131 09:31:55.646968 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:31:57 crc kubenswrapper[4783]: I0131 09:31:57.105324 
4783 generic.go:334] "Generic (PLEG): container finished" podID="cf5af9ea-73f9-4316-8fdc-abe4ede8632a" containerID="9173cbb38addbfa704b66a4d399d570897cca50743a1328c9733d1a39611992d" exitCode=0 Jan 31 09:31:57 crc kubenswrapper[4783]: I0131 09:31:57.105366 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" event={"ID":"cf5af9ea-73f9-4316-8fdc-abe4ede8632a","Type":"ContainerDied","Data":"9173cbb38addbfa704b66a4d399d570897cca50743a1328c9733d1a39611992d"} Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.431498 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.578996 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7srr\" (UniqueName: \"kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr\") pod \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.579061 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory\") pod \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.579099 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam\") pod \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.579201 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0\") pod \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.579265 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle\") pod \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\" (UID: \"cf5af9ea-73f9-4316-8fdc-abe4ede8632a\") " Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.585258 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr" (OuterVolumeSpecName: "kube-api-access-q7srr") pod "cf5af9ea-73f9-4316-8fdc-abe4ede8632a" (UID: "cf5af9ea-73f9-4316-8fdc-abe4ede8632a"). InnerVolumeSpecName "kube-api-access-q7srr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.585806 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cf5af9ea-73f9-4316-8fdc-abe4ede8632a" (UID: "cf5af9ea-73f9-4316-8fdc-abe4ede8632a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.603240 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "cf5af9ea-73f9-4316-8fdc-abe4ede8632a" (UID: "cf5af9ea-73f9-4316-8fdc-abe4ede8632a"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.603258 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory" (OuterVolumeSpecName: "inventory") pod "cf5af9ea-73f9-4316-8fdc-abe4ede8632a" (UID: "cf5af9ea-73f9-4316-8fdc-abe4ede8632a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.604898 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf5af9ea-73f9-4316-8fdc-abe4ede8632a" (UID: "cf5af9ea-73f9-4316-8fdc-abe4ede8632a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.681408 4783 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.681448 4783 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.681459 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7srr\" (UniqueName: \"kubernetes.io/projected/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-kube-api-access-q7srr\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.681468 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:58 crc kubenswrapper[4783]: I0131 09:31:58.681478 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf5af9ea-73f9-4316-8fdc-abe4ede8632a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.122325 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" event={"ID":"cf5af9ea-73f9-4316-8fdc-abe4ede8632a","Type":"ContainerDied","Data":"51dd149386ef2e4d61ac3354b7218ffde8af12096117fe238d7d77f0675c2afd"} Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.122667 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51dd149386ef2e4d61ac3354b7218ffde8af12096117fe238d7d77f0675c2afd" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.122411 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zp2tb" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.241308 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2"] Jan 31 09:31:59 crc kubenswrapper[4783]: E0131 09:31:59.241674 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5af9ea-73f9-4316-8fdc-abe4ede8632a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.241691 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5af9ea-73f9-4316-8fdc-abe4ede8632a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.241890 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5af9ea-73f9-4316-8fdc-abe4ede8632a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.242537 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.245450 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.245666 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.245870 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.246042 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.246305 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.246918 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.248842 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2"] Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292222 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292315 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sg45t\" (UniqueName: \"kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292387 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292417 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292469 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.292496 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395116 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395247 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg45t\" (UniqueName: \"kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395323 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395349 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395382 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.395402 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.399469 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.399765 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.399960 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.400031 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.402192 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.410881 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg45t\" (UniqueName: \"kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:31:59 crc kubenswrapper[4783]: I0131 09:31:59.558270 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:32:00 crc kubenswrapper[4783]: I0131 09:32:00.017994 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2"] Jan 31 09:32:00 crc kubenswrapper[4783]: I0131 09:32:00.133032 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" event={"ID":"a22a7456-83e3-46ef-80c2-ebea731972b9","Type":"ContainerStarted","Data":"4e790ac0eab9bda8ff577cdf873949ed9dcd3d06cb53ee38849c55d06bb179a8"} Jan 31 09:32:00 crc kubenswrapper[4783]: I0131 09:32:00.744097 4783 scope.go:117] "RemoveContainer" containerID="b2fa835bac2c2f9da8ceca08b17a1aa1557c3d284a9b06ac7a38e39b29237280" Jan 31 09:32:01 crc kubenswrapper[4783]: I0131 09:32:01.142754 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" event={"ID":"a22a7456-83e3-46ef-80c2-ebea731972b9","Type":"ContainerStarted","Data":"02c5cb8d9626a430ec44c7ca79df16d145d7dc514f7d2e353a7df20dbcf66424"} Jan 31 09:32:01 crc kubenswrapper[4783]: I0131 09:32:01.159607 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" podStartSLOduration=1.6932298019999998 podStartE2EDuration="2.159590042s" podCreationTimestamp="2026-01-31 09:31:59 +0000 UTC" firstStartedPulling="2026-01-31 09:32:00.023781021 +0000 UTC m=+1630.692464479" lastFinishedPulling="2026-01-31 09:32:00.490141251 +0000 UTC m=+1631.158824719" observedRunningTime="2026-01-31 09:32:01.157923209 +0000 UTC m=+1631.826606677" watchObservedRunningTime="2026-01-31 09:32:01.159590042 +0000 UTC 
m=+1631.828273510" Jan 31 09:32:07 crc kubenswrapper[4783]: I0131 09:32:07.646220 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:32:07 crc kubenswrapper[4783]: E0131 09:32:07.646985 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:32:20 crc kubenswrapper[4783]: I0131 09:32:20.646081 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:32:20 crc kubenswrapper[4783]: E0131 09:32:20.646896 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:32:32 crc kubenswrapper[4783]: I0131 09:32:32.645567 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:32:32 crc kubenswrapper[4783]: E0131 09:32:32.646931 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" 
podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:32:34 crc kubenswrapper[4783]: I0131 09:32:34.404281 4783 generic.go:334] "Generic (PLEG): container finished" podID="a22a7456-83e3-46ef-80c2-ebea731972b9" containerID="02c5cb8d9626a430ec44c7ca79df16d145d7dc514f7d2e353a7df20dbcf66424" exitCode=0 Jan 31 09:32:34 crc kubenswrapper[4783]: I0131 09:32:34.404350 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" event={"ID":"a22a7456-83e3-46ef-80c2-ebea731972b9","Type":"ContainerDied","Data":"02c5cb8d9626a430ec44c7ca79df16d145d7dc514f7d2e353a7df20dbcf66424"} Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.754037 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.876101 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.876196 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.876289 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc 
kubenswrapper[4783]: I0131 09:32:35.876320 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg45t\" (UniqueName: \"kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.876459 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.876526 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0\") pod \"a22a7456-83e3-46ef-80c2-ebea731972b9\" (UID: \"a22a7456-83e3-46ef-80c2-ebea731972b9\") " Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.882323 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.884612 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t" (OuterVolumeSpecName: "kube-api-access-sg45t") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "kube-api-access-sg45t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.899823 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.900156 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory" (OuterVolumeSpecName: "inventory") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.901075 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.901366 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a22a7456-83e3-46ef-80c2-ebea731972b9" (UID: "a22a7456-83e3-46ef-80c2-ebea731972b9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.978924 4783 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.978961 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.978973 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.978982 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg45t\" (UniqueName: \"kubernetes.io/projected/a22a7456-83e3-46ef-80c2-ebea731972b9-kube-api-access-sg45t\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.978990 4783 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:35 crc kubenswrapper[4783]: I0131 09:32:35.979000 4783 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a22a7456-83e3-46ef-80c2-ebea731972b9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.421294 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" event={"ID":"a22a7456-83e3-46ef-80c2-ebea731972b9","Type":"ContainerDied","Data":"4e790ac0eab9bda8ff577cdf873949ed9dcd3d06cb53ee38849c55d06bb179a8"} Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.421329 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.421335 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e790ac0eab9bda8ff577cdf873949ed9dcd3d06cb53ee38849c55d06bb179a8" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.494965 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm"] Jan 31 09:32:36 crc kubenswrapper[4783]: E0131 09:32:36.495284 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a7456-83e3-46ef-80c2-ebea731972b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.495300 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a7456-83e3-46ef-80c2-ebea731972b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.495467 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a22a7456-83e3-46ef-80c2-ebea731972b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.495995 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.500535 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.501155 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.501810 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.502370 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.502569 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.527952 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm"] Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.591509 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29qg\" (UniqueName: \"kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.591782 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.592225 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.592265 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.592531 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.694853 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z29qg\" (UniqueName: \"kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.694951 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.695043 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.695065 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.695154 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.698601 4783 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.699027 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.699068 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.700260 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.708611 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29qg\" (UniqueName: \"kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm\" (UID: 
\"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:36 crc kubenswrapper[4783]: I0131 09:32:36.814964 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:32:37 crc kubenswrapper[4783]: I0131 09:32:37.253974 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm"] Jan 31 09:32:37 crc kubenswrapper[4783]: I0131 09:32:37.429653 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" event={"ID":"777883d7-012b-4006-afcb-d5fcd8a0eb68","Type":"ContainerStarted","Data":"165533a324c9c63f6d6119e9c6ac1ddbf6df7e397cfd02cf0c1420d88d556128"} Jan 31 09:32:38 crc kubenswrapper[4783]: I0131 09:32:38.440105 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" event={"ID":"777883d7-012b-4006-afcb-d5fcd8a0eb68","Type":"ContainerStarted","Data":"2e0f4c88b9d135423e886a7fb83d55a49c7783bf68081f2b60a6469366b3f052"} Jan 31 09:32:38 crc kubenswrapper[4783]: I0131 09:32:38.454513 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" podStartSLOduration=1.9126299439999999 podStartE2EDuration="2.454495946s" podCreationTimestamp="2026-01-31 09:32:36 +0000 UTC" firstStartedPulling="2026-01-31 09:32:37.259185043 +0000 UTC m=+1667.927868511" lastFinishedPulling="2026-01-31 09:32:37.801051045 +0000 UTC m=+1668.469734513" observedRunningTime="2026-01-31 09:32:38.450611681 +0000 UTC m=+1669.119295149" watchObservedRunningTime="2026-01-31 09:32:38.454495946 +0000 UTC m=+1669.123179413" Jan 31 09:32:45 crc kubenswrapper[4783]: I0131 09:32:45.646011 4783 scope.go:117] "RemoveContainer" 
containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:32:45 crc kubenswrapper[4783]: E0131 09:32:45.646787 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:32:59 crc kubenswrapper[4783]: I0131 09:32:59.652149 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:32:59 crc kubenswrapper[4783]: E0131 09:32:59.653107 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:33:10 crc kubenswrapper[4783]: I0131 09:33:10.646394 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:33:10 crc kubenswrapper[4783]: E0131 09:33:10.647475 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:33:21 crc kubenswrapper[4783]: I0131 09:33:21.657397 4783 scope.go:117] 
"RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:33:21 crc kubenswrapper[4783]: E0131 09:33:21.659274 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:33:34 crc kubenswrapper[4783]: I0131 09:33:34.646444 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:33:34 crc kubenswrapper[4783]: E0131 09:33:34.647016 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:33:47 crc kubenswrapper[4783]: I0131 09:33:47.645897 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:33:47 crc kubenswrapper[4783]: E0131 09:33:47.647480 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:34:02 crc kubenswrapper[4783]: I0131 09:34:02.645863 
4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:34:02 crc kubenswrapper[4783]: E0131 09:34:02.646536 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:34:14 crc kubenswrapper[4783]: I0131 09:34:14.645695 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:34:14 crc kubenswrapper[4783]: E0131 09:34:14.646387 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:34:26 crc kubenswrapper[4783]: I0131 09:34:26.645772 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:34:26 crc kubenswrapper[4783]: E0131 09:34:26.646523 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:34:41 crc kubenswrapper[4783]: I0131 
09:34:41.646003 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:34:41 crc kubenswrapper[4783]: E0131 09:34:41.646748 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:34:52 crc kubenswrapper[4783]: I0131 09:34:52.646289 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:34:52 crc kubenswrapper[4783]: E0131 09:34:52.647052 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:35:03 crc kubenswrapper[4783]: I0131 09:35:03.645908 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:35:03 crc kubenswrapper[4783]: E0131 09:35:03.647148 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:35:17 crc 
kubenswrapper[4783]: I0131 09:35:17.646206 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:35:17 crc kubenswrapper[4783]: E0131 09:35:17.647291 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:35:23 crc kubenswrapper[4783]: I0131 09:35:23.714720 4783 generic.go:334] "Generic (PLEG): container finished" podID="777883d7-012b-4006-afcb-d5fcd8a0eb68" containerID="2e0f4c88b9d135423e886a7fb83d55a49c7783bf68081f2b60a6469366b3f052" exitCode=0 Jan 31 09:35:23 crc kubenswrapper[4783]: I0131 09:35:23.714783 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" event={"ID":"777883d7-012b-4006-afcb-d5fcd8a0eb68","Type":"ContainerDied","Data":"2e0f4c88b9d135423e886a7fb83d55a49c7783bf68081f2b60a6469366b3f052"} Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.150006 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.191882 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z29qg\" (UniqueName: \"kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg\") pod \"777883d7-012b-4006-afcb-d5fcd8a0eb68\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.192118 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0\") pod \"777883d7-012b-4006-afcb-d5fcd8a0eb68\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.192287 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory\") pod \"777883d7-012b-4006-afcb-d5fcd8a0eb68\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.192394 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle\") pod \"777883d7-012b-4006-afcb-d5fcd8a0eb68\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.192479 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam\") pod \"777883d7-012b-4006-afcb-d5fcd8a0eb68\" (UID: \"777883d7-012b-4006-afcb-d5fcd8a0eb68\") " Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.197097 4783 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "777883d7-012b-4006-afcb-d5fcd8a0eb68" (UID: "777883d7-012b-4006-afcb-d5fcd8a0eb68"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.198705 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg" (OuterVolumeSpecName: "kube-api-access-z29qg") pod "777883d7-012b-4006-afcb-d5fcd8a0eb68" (UID: "777883d7-012b-4006-afcb-d5fcd8a0eb68"). InnerVolumeSpecName "kube-api-access-z29qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.213791 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory" (OuterVolumeSpecName: "inventory") pod "777883d7-012b-4006-afcb-d5fcd8a0eb68" (UID: "777883d7-012b-4006-afcb-d5fcd8a0eb68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.215150 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "777883d7-012b-4006-afcb-d5fcd8a0eb68" (UID: "777883d7-012b-4006-afcb-d5fcd8a0eb68"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.215387 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "777883d7-012b-4006-afcb-d5fcd8a0eb68" (UID: "777883d7-012b-4006-afcb-d5fcd8a0eb68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.293679 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.293707 4783 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.293722 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.293732 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z29qg\" (UniqueName: \"kubernetes.io/projected/777883d7-012b-4006-afcb-d5fcd8a0eb68-kube-api-access-z29qg\") on node \"crc\" DevicePath \"\"" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.293741 4783 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/777883d7-012b-4006-afcb-d5fcd8a0eb68-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.730294 4783 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" event={"ID":"777883d7-012b-4006-afcb-d5fcd8a0eb68","Type":"ContainerDied","Data":"165533a324c9c63f6d6119e9c6ac1ddbf6df7e397cfd02cf0c1420d88d556128"} Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.730598 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165533a324c9c63f6d6119e9c6ac1ddbf6df7e397cfd02cf0c1420d88d556128" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.730327 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.795061 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4"] Jan 31 09:35:25 crc kubenswrapper[4783]: E0131 09:35:25.795411 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777883d7-012b-4006-afcb-d5fcd8a0eb68" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.795429 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="777883d7-012b-4006-afcb-d5fcd8a0eb68" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.795593 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="777883d7-012b-4006-afcb-d5fcd8a0eb68" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.796098 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.798464 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.799839 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.799959 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.800003 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.799847 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.800796 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.802970 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.807224 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4"] Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902589 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902663 4783 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902708 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902810 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjrcl\" (UniqueName: \"kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902857 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902899 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.902961 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.903014 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:25 crc kubenswrapper[4783]: I0131 09:35:25.903102 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005049 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: 
\"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005134 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005194 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005246 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005318 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjrcl\" (UniqueName: \"kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005354 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005388 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005443 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.005491 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.006760 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.008992 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.009344 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.009864 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.010132 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.010387 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.010409 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.010475 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.019696 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjrcl\" (UniqueName: \"kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q6sd4\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.108775 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.540555 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4"] Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.545379 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:35:26 crc kubenswrapper[4783]: I0131 09:35:26.738005 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" event={"ID":"57ec9c0f-9c30-4c10-afd7-84ac778f9069","Type":"ContainerStarted","Data":"38a63270bc9150cf444e008cee84707b29c77bd869b5ce72586e07f20bc3f8d3"} Jan 31 09:35:27 crc kubenswrapper[4783]: I0131 09:35:27.745146 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" event={"ID":"57ec9c0f-9c30-4c10-afd7-84ac778f9069","Type":"ContainerStarted","Data":"770088b7c54e6733444fff8ec81aeea8cdc4b4cfec9e9d8c5075dd5ab2142e82"} Jan 31 09:35:29 crc kubenswrapper[4783]: I0131 09:35:29.652622 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:35:29 crc kubenswrapper[4783]: E0131 09:35:29.653455 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:35:42 crc kubenswrapper[4783]: I0131 09:35:42.645659 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:35:42 crc kubenswrapper[4783]: E0131 
09:35:42.646606 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:35:54 crc kubenswrapper[4783]: I0131 09:35:54.646055 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:35:54 crc kubenswrapper[4783]: E0131 09:35:54.646979 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.845890 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" podStartSLOduration=36.300232694 podStartE2EDuration="36.845874865s" podCreationTimestamp="2026-01-31 09:35:25 +0000 UTC" firstStartedPulling="2026-01-31 09:35:26.545074431 +0000 UTC m=+1837.213757899" lastFinishedPulling="2026-01-31 09:35:27.090716602 +0000 UTC m=+1837.759400070" observedRunningTime="2026-01-31 09:35:27.767987081 +0000 UTC m=+1838.436670549" watchObservedRunningTime="2026-01-31 09:36:01.845874865 +0000 UTC m=+1872.514558333" Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.847333 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.848977 
4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.859292 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.924547 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.924834 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:01 crc kubenswrapper[4783]: I0131 09:36:01.924891 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7c7\" (UniqueName: \"kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.026211 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 
09:36:02.026314 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7c7\" (UniqueName: \"kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.026437 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.026708 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.026785 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.045637 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7c7\" (UniqueName: \"kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7\") pod \"community-operators-4wk6b\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.166567 4783 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.613058 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.994348 4783 generic.go:334] "Generic (PLEG): container finished" podID="65697295-79b1-47af-887f-531827ecc46f" containerID="1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212" exitCode=0 Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.994500 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerDied","Data":"1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212"} Jan 31 09:36:02 crc kubenswrapper[4783]: I0131 09:36:02.994715 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerStarted","Data":"c2a6c4d3b3f8ba540d53ef1b1953447f48c7e93bf005b3f83104f25266106ef9"} Jan 31 09:36:04 crc kubenswrapper[4783]: I0131 09:36:04.003214 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerStarted","Data":"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee"} Jan 31 09:36:05 crc kubenswrapper[4783]: I0131 09:36:05.011850 4783 generic.go:334] "Generic (PLEG): container finished" podID="65697295-79b1-47af-887f-531827ecc46f" containerID="382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee" exitCode=0 Jan 31 09:36:05 crc kubenswrapper[4783]: I0131 09:36:05.011940 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" 
event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerDied","Data":"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee"} Jan 31 09:36:05 crc kubenswrapper[4783]: I0131 09:36:05.646633 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:36:05 crc kubenswrapper[4783]: E0131 09:36:05.647061 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:36:06 crc kubenswrapper[4783]: I0131 09:36:06.022566 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerStarted","Data":"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345"} Jan 31 09:36:06 crc kubenswrapper[4783]: I0131 09:36:06.041222 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4wk6b" podStartSLOduration=2.5580883439999997 podStartE2EDuration="5.041206129s" podCreationTimestamp="2026-01-31 09:36:01 +0000 UTC" firstStartedPulling="2026-01-31 09:36:02.996753333 +0000 UTC m=+1873.665436811" lastFinishedPulling="2026-01-31 09:36:05.479871128 +0000 UTC m=+1876.148554596" observedRunningTime="2026-01-31 09:36:06.035310671 +0000 UTC m=+1876.703994140" watchObservedRunningTime="2026-01-31 09:36:06.041206129 +0000 UTC m=+1876.709889597" Jan 31 09:36:12 crc kubenswrapper[4783]: I0131 09:36:12.167302 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:12 crc 
kubenswrapper[4783]: I0131 09:36:12.168233 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:12 crc kubenswrapper[4783]: I0131 09:36:12.201867 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:13 crc kubenswrapper[4783]: I0131 09:36:13.112241 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:13 crc kubenswrapper[4783]: I0131 09:36:13.436307 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.097792 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4wk6b" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="registry-server" containerID="cri-o://289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345" gracePeriod=2 Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.548778 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.690978 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content\") pod \"65697295-79b1-47af-887f-531827ecc46f\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.691096 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7c7\" (UniqueName: \"kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7\") pod \"65697295-79b1-47af-887f-531827ecc46f\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.691138 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities\") pod \"65697295-79b1-47af-887f-531827ecc46f\" (UID: \"65697295-79b1-47af-887f-531827ecc46f\") " Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.692216 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities" (OuterVolumeSpecName: "utilities") pod "65697295-79b1-47af-887f-531827ecc46f" (UID: "65697295-79b1-47af-887f-531827ecc46f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.697056 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7" (OuterVolumeSpecName: "kube-api-access-7v7c7") pod "65697295-79b1-47af-887f-531827ecc46f" (UID: "65697295-79b1-47af-887f-531827ecc46f"). InnerVolumeSpecName "kube-api-access-7v7c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.731916 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65697295-79b1-47af-887f-531827ecc46f" (UID: "65697295-79b1-47af-887f-531827ecc46f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.794005 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.794068 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7c7\" (UniqueName: \"kubernetes.io/projected/65697295-79b1-47af-887f-531827ecc46f-kube-api-access-7v7c7\") on node \"crc\" DevicePath \"\"" Jan 31 09:36:15 crc kubenswrapper[4783]: I0131 09:36:15.794086 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65697295-79b1-47af-887f-531827ecc46f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.107561 4783 generic.go:334] "Generic (PLEG): container finished" podID="65697295-79b1-47af-887f-531827ecc46f" containerID="289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345" exitCode=0 Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.107613 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerDied","Data":"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345"} Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.107622 4783 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wk6b" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.107644 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wk6b" event={"ID":"65697295-79b1-47af-887f-531827ecc46f","Type":"ContainerDied","Data":"c2a6c4d3b3f8ba540d53ef1b1953447f48c7e93bf005b3f83104f25266106ef9"} Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.107662 4783 scope.go:117] "RemoveContainer" containerID="289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.130469 4783 scope.go:117] "RemoveContainer" containerID="382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.144042 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.151244 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4wk6b"] Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.163470 4783 scope.go:117] "RemoveContainer" containerID="1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.195068 4783 scope.go:117] "RemoveContainer" containerID="289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345" Jan 31 09:36:16 crc kubenswrapper[4783]: E0131 09:36:16.195590 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345\": container with ID starting with 289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345 not found: ID does not exist" containerID="289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.195629 
4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345"} err="failed to get container status \"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345\": rpc error: code = NotFound desc = could not find container \"289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345\": container with ID starting with 289e72d9e74699f9bf512ee222825ef3cfffbd23f7f8cdbf2e726a55d2d52345 not found: ID does not exist" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.195656 4783 scope.go:117] "RemoveContainer" containerID="382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee" Jan 31 09:36:16 crc kubenswrapper[4783]: E0131 09:36:16.195985 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee\": container with ID starting with 382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee not found: ID does not exist" containerID="382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.196015 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee"} err="failed to get container status \"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee\": rpc error: code = NotFound desc = could not find container \"382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee\": container with ID starting with 382d5ee114aae7dfec4a6ee604b769294ac99b8614508cd2f56fe5c97072bdee not found: ID does not exist" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.196033 4783 scope.go:117] "RemoveContainer" containerID="1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212" Jan 31 09:36:16 crc kubenswrapper[4783]: E0131 
09:36:16.196508 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212\": container with ID starting with 1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212 not found: ID does not exist" containerID="1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212" Jan 31 09:36:16 crc kubenswrapper[4783]: I0131 09:36:16.196534 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212"} err="failed to get container status \"1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212\": rpc error: code = NotFound desc = could not find container \"1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212\": container with ID starting with 1f81f97475cd790cbb091e51ddb983e14a9c8e648c77c6a76ea6f4135d1b7212 not found: ID does not exist" Jan 31 09:36:17 crc kubenswrapper[4783]: I0131 09:36:17.655270 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65697295-79b1-47af-887f-531827ecc46f" path="/var/lib/kubelet/pods/65697295-79b1-47af-887f-531827ecc46f/volumes" Jan 31 09:36:18 crc kubenswrapper[4783]: I0131 09:36:18.646558 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:36:19 crc kubenswrapper[4783]: I0131 09:36:19.134087 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200"} Jan 31 09:37:03 crc kubenswrapper[4783]: I0131 09:37:03.451917 4783 generic.go:334] "Generic (PLEG): container finished" podID="57ec9c0f-9c30-4c10-afd7-84ac778f9069" 
containerID="770088b7c54e6733444fff8ec81aeea8cdc4b4cfec9e9d8c5075dd5ab2142e82" exitCode=0 Jan 31 09:37:03 crc kubenswrapper[4783]: I0131 09:37:03.452009 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" event={"ID":"57ec9c0f-9c30-4c10-afd7-84ac778f9069","Type":"ContainerDied","Data":"770088b7c54e6733444fff8ec81aeea8cdc4b4cfec9e9d8c5075dd5ab2142e82"} Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.777019 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887147 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjrcl\" (UniqueName: \"kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887318 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887344 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887370 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887444 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887469 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887643 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887681 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.887719 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0\") pod \"57ec9c0f-9c30-4c10-afd7-84ac778f9069\" (UID: 
\"57ec9c0f-9c30-4c10-afd7-84ac778f9069\") " Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.893311 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl" (OuterVolumeSpecName: "kube-api-access-tjrcl") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "kube-api-access-tjrcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.895139 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.910283 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.911533 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.917033 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.919566 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.921811 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.922282 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory" (OuterVolumeSpecName: "inventory") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.933038 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "57ec9c0f-9c30-4c10-afd7-84ac778f9069" (UID: "57ec9c0f-9c30-4c10-afd7-84ac778f9069"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.989990 4783 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990016 4783 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990026 4783 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990034 4783 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990043 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 
09:37:04.990053 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990061 4783 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990068 4783 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ec9c0f-9c30-4c10-afd7-84ac778f9069-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:04 crc kubenswrapper[4783]: I0131 09:37:04.990076 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjrcl\" (UniqueName: \"kubernetes.io/projected/57ec9c0f-9c30-4c10-afd7-84ac778f9069-kube-api-access-tjrcl\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.465685 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" event={"ID":"57ec9c0f-9c30-4c10-afd7-84ac778f9069","Type":"ContainerDied","Data":"38a63270bc9150cf444e008cee84707b29c77bd869b5ce72586e07f20bc3f8d3"} Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.465724 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q6sd4" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.465729 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a63270bc9150cf444e008cee84707b29c77bd869b5ce72586e07f20bc3f8d3" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.537945 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g"] Jan 31 09:37:05 crc kubenswrapper[4783]: E0131 09:37:05.538441 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="extract-utilities" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.538459 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="extract-utilities" Jan 31 09:37:05 crc kubenswrapper[4783]: E0131 09:37:05.538476 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ec9c0f-9c30-4c10-afd7-84ac778f9069" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.538483 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ec9c0f-9c30-4c10-afd7-84ac778f9069" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 31 09:37:05 crc kubenswrapper[4783]: E0131 09:37:05.538497 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="extract-content" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.538502 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="extract-content" Jan 31 09:37:05 crc kubenswrapper[4783]: E0131 09:37:05.538518 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="registry-server" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 
09:37:05.538522 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="registry-server" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.538721 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ec9c0f-9c30-4c10-afd7-84ac778f9069" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.538738 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="65697295-79b1-47af-887f-531827ecc46f" containerName="registry-server" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.539359 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.541918 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.542152 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.542401 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.542543 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jbkw8" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.542653 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.547852 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g"] Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599730 4783 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599788 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfpc5\" (UniqueName: \"kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599827 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599845 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599868 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.599916 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.600015 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702078 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702210 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfpc5\" (UniqueName: \"kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702282 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702309 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702367 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702436 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" 
Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.702605 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.706943 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.707011 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.707078 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.707649 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.707963 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.708832 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.717293 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfpc5\" (UniqueName: \"kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:05 crc kubenswrapper[4783]: I0131 09:37:05.862092 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:37:06 crc kubenswrapper[4783]: I0131 09:37:06.290966 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g"] Jan 31 09:37:06 crc kubenswrapper[4783]: I0131 09:37:06.472522 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" event={"ID":"36f447a7-72aa-465f-8fad-1c0bb7c71a9e","Type":"ContainerStarted","Data":"3b37147436c7bad8d4c9cb91227a89fd2f1e911116eaf90a312434e1aa2427f9"} Jan 31 09:37:07 crc kubenswrapper[4783]: I0131 09:37:07.481214 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" event={"ID":"36f447a7-72aa-465f-8fad-1c0bb7c71a9e","Type":"ContainerStarted","Data":"7a7b35a5b86618da55e097d6ddfd1a7f7c2ba7dd906e060c39d09ae361b16247"} Jan 31 09:37:07 crc kubenswrapper[4783]: I0131 09:37:07.502769 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" podStartSLOduration=2.018881338 podStartE2EDuration="2.502752197s" podCreationTimestamp="2026-01-31 09:37:05 +0000 UTC" firstStartedPulling="2026-01-31 09:37:06.298861864 +0000 UTC m=+1936.967545332" lastFinishedPulling="2026-01-31 09:37:06.782732723 +0000 UTC m=+1937.451416191" observedRunningTime="2026-01-31 09:37:07.502012391 +0000 UTC m=+1938.170695859" watchObservedRunningTime="2026-01-31 09:37:07.502752197 +0000 UTC m=+1938.171435665" Jan 31 09:38:47 crc kubenswrapper[4783]: I0131 09:38:47.756654 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:38:47 crc kubenswrapper[4783]: 
I0131 09:38:47.757121 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:38:49 crc kubenswrapper[4783]: I0131 09:38:49.178809 4783 generic.go:334] "Generic (PLEG): container finished" podID="36f447a7-72aa-465f-8fad-1c0bb7c71a9e" containerID="7a7b35a5b86618da55e097d6ddfd1a7f7c2ba7dd906e060c39d09ae361b16247" exitCode=0 Jan 31 09:38:49 crc kubenswrapper[4783]: I0131 09:38:49.178876 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" event={"ID":"36f447a7-72aa-465f-8fad-1c0bb7c71a9e","Type":"ContainerDied","Data":"7a7b35a5b86618da55e097d6ddfd1a7f7c2ba7dd906e060c39d09ae361b16247"} Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.481001 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.634937 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.634996 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.635136 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.635258 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.635377 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: 
\"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.635398 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfpc5\" (UniqueName: \"kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.635451 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory\") pod \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\" (UID: \"36f447a7-72aa-465f-8fad-1c0bb7c71a9e\") " Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.641291 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.641367 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5" (OuterVolumeSpecName: "kube-api-access-wfpc5") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "kube-api-access-wfpc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.658099 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.659448 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.660082 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory" (OuterVolumeSpecName: "inventory") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.660482 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.665457 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "36f447a7-72aa-465f-8fad-1c0bb7c71a9e" (UID: "36f447a7-72aa-465f-8fad-1c0bb7c71a9e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738007 4783 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738039 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738051 4783 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738063 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfpc5\" (UniqueName: \"kubernetes.io/projected/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-kube-api-access-wfpc5\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738075 4783 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc 
kubenswrapper[4783]: I0131 09:38:50.738086 4783 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:50 crc kubenswrapper[4783]: I0131 09:38:50.738095 4783 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/36f447a7-72aa-465f-8fad-1c0bb7c71a9e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:51 crc kubenswrapper[4783]: I0131 09:38:51.198416 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" event={"ID":"36f447a7-72aa-465f-8fad-1c0bb7c71a9e","Type":"ContainerDied","Data":"3b37147436c7bad8d4c9cb91227a89fd2f1e911116eaf90a312434e1aa2427f9"} Jan 31 09:38:51 crc kubenswrapper[4783]: I0131 09:38:51.198459 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b37147436c7bad8d4c9cb91227a89fd2f1e911116eaf90a312434e1aa2427f9" Jan 31 09:38:51 crc kubenswrapper[4783]: I0131 09:38:51.198489 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g" Jan 31 09:39:17 crc kubenswrapper[4783]: I0131 09:39:17.757081 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:39:17 crc kubenswrapper[4783]: I0131 09:39:17.757679 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.949331 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 31 09:39:38 crc kubenswrapper[4783]: E0131 09:39:38.950505 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f447a7-72aa-465f-8fad-1c0bb7c71a9e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.950522 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f447a7-72aa-465f-8fad-1c0bb7c71a9e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.950717 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f447a7-72aa-465f-8fad-1c0bb7c71a9e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.951416 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.954768 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6f9wl" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.954920 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.954969 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.955083 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 31 09:39:38 crc kubenswrapper[4783]: I0131 09:39:38.964685 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.044787 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.044959 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.045015 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147465 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147552 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147586 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147614 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147641 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147681 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.147929 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq4d\" (UniqueName: \"kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.148134 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.148272 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.148867 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.149137 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.154984 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.250834 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.251214 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.251273 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc 
kubenswrapper[4783]: I0131 09:39:39.251310 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.251340 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.251400 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq4d\" (UniqueName: \"kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.251997 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.252054 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.252266 
4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.256507 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.256565 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.265963 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq4d\" (UniqueName: \"kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.273997 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " pod="openstack/tempest-tests-tempest" Jan 31 09:39:39 crc kubenswrapper[4783]: I0131 09:39:39.570479 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 31 09:39:40 crc kubenswrapper[4783]: I0131 09:39:40.033862 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 31 09:39:40 crc kubenswrapper[4783]: I0131 09:39:40.605881 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16292449-5a29-426e-aa57-18e752dd60f6","Type":"ContainerStarted","Data":"b818a9958edc8dd84b6997ca42ed80cf9d041806c0bd3daabe9f70708b9f0713"} Jan 31 09:39:47 crc kubenswrapper[4783]: I0131 09:39:47.757124 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:39:47 crc kubenswrapper[4783]: I0131 09:39:47.757664 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:39:47 crc kubenswrapper[4783]: I0131 09:39:47.757728 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:39:47 crc kubenswrapper[4783]: I0131 09:39:47.758340 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:39:47 crc kubenswrapper[4783]: I0131 09:39:47.758405 4783 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200" gracePeriod=600 Jan 31 09:39:48 crc kubenswrapper[4783]: I0131 09:39:48.690476 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200" exitCode=0 Jan 31 09:39:48 crc kubenswrapper[4783]: I0131 09:39:48.690546 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200"} Jan 31 09:39:48 crc kubenswrapper[4783]: I0131 09:39:48.691017 4783 scope.go:117] "RemoveContainer" containerID="8ba1b54cac3bfe7de1f7827500e06985757a655e8a5e48cf33f52798d9eee13b" Jan 31 09:39:50 crc kubenswrapper[4783]: I0131 09:39:50.716537 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9"} Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.336078 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"] Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.338881 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.349554 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"] Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.493131 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqjh\" (UniqueName: \"kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.493273 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.493319 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.595985 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.596069 4783 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.596201 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqjh\" (UniqueName: \"kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.596842 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.596934 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.617235 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqjh\" (UniqueName: \"kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh\") pod \"redhat-marketplace-bwcr9\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") " pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:02 crc kubenswrapper[4783]: I0131 09:40:02.660963 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:13 crc kubenswrapper[4783]: E0131 09:40:13.724224 4783 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 31 09:40:13 crc kubenswrapper[4783]: E0131 09:40:13.724824 4783 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust
/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqq4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(16292449-5a29-426e-aa57-18e752dd60f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:40:13 crc kubenswrapper[4783]: E0131 09:40:13.726415 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="16292449-5a29-426e-aa57-18e752dd60f6" Jan 31 09:40:13 crc kubenswrapper[4783]: E0131 09:40:13.934669 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="16292449-5a29-426e-aa57-18e752dd60f6" Jan 31 09:40:14 crc kubenswrapper[4783]: I0131 09:40:14.143312 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"] Jan 31 09:40:14 crc kubenswrapper[4783]: W0131 09:40:14.143986 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaafe23ef_6d92_49dc_b311_d85dee42bf0f.slice/crio-e9fb5268b3ed9b26c3dcb2cc5cf5e6fbca2a7a493be7f7a353b7a9f6cb11fd3f WatchSource:0}: Error finding container e9fb5268b3ed9b26c3dcb2cc5cf5e6fbca2a7a493be7f7a353b7a9f6cb11fd3f: Status 404 returned error can't find the container with id e9fb5268b3ed9b26c3dcb2cc5cf5e6fbca2a7a493be7f7a353b7a9f6cb11fd3f Jan 31 09:40:14 crc kubenswrapper[4783]: I0131 09:40:14.940944 4783 generic.go:334] "Generic (PLEG): container finished" podID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerID="03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9" exitCode=0 Jan 31 09:40:14 crc kubenswrapper[4783]: I0131 09:40:14.941040 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerDied","Data":"03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9"} Jan 31 09:40:14 crc kubenswrapper[4783]: I0131 09:40:14.941265 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" 
event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerStarted","Data":"e9fb5268b3ed9b26c3dcb2cc5cf5e6fbca2a7a493be7f7a353b7a9f6cb11fd3f"} Jan 31 09:40:16 crc kubenswrapper[4783]: I0131 09:40:16.959918 4783 generic.go:334] "Generic (PLEG): container finished" podID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerID="47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24" exitCode=0 Jan 31 09:40:16 crc kubenswrapper[4783]: I0131 09:40:16.960196 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerDied","Data":"47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24"} Jan 31 09:40:17 crc kubenswrapper[4783]: I0131 09:40:17.978192 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerStarted","Data":"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"} Jan 31 09:40:17 crc kubenswrapper[4783]: I0131 09:40:17.995949 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bwcr9" podStartSLOduration=13.52286998 podStartE2EDuration="15.995933806s" podCreationTimestamp="2026-01-31 09:40:02 +0000 UTC" firstStartedPulling="2026-01-31 09:40:14.942217924 +0000 UTC m=+2125.610901392" lastFinishedPulling="2026-01-31 09:40:17.41528175 +0000 UTC m=+2128.083965218" observedRunningTime="2026-01-31 09:40:17.993373438 +0000 UTC m=+2128.662056905" watchObservedRunningTime="2026-01-31 09:40:17.995933806 +0000 UTC m=+2128.664617274" Jan 31 09:40:22 crc kubenswrapper[4783]: I0131 09:40:22.661104 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bwcr9" Jan 31 09:40:22 crc kubenswrapper[4783]: I0131 09:40:22.661839 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bwcr9"
Jan 31 09:40:22 crc kubenswrapper[4783]: I0131 09:40:22.703771 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bwcr9"
Jan 31 09:40:23 crc kubenswrapper[4783]: I0131 09:40:23.058247 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bwcr9"
Jan 31 09:40:23 crc kubenswrapper[4783]: I0131 09:40:23.111551 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"]
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.038847 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bwcr9" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="registry-server" containerID="cri-o://30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5" gracePeriod=2
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.532433 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwcr9"
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.725229 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhqjh\" (UniqueName: \"kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh\") pod \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") "
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.725403 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities\") pod \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") "
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.725458 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content\") pod \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\" (UID: \"aafe23ef-6d92-49dc-b311-d85dee42bf0f\") "
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.726287 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities" (OuterVolumeSpecName: "utilities") pod "aafe23ef-6d92-49dc-b311-d85dee42bf0f" (UID: "aafe23ef-6d92-49dc-b311-d85dee42bf0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.732420 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh" (OuterVolumeSpecName: "kube-api-access-fhqjh") pod "aafe23ef-6d92-49dc-b311-d85dee42bf0f" (UID: "aafe23ef-6d92-49dc-b311-d85dee42bf0f"). InnerVolumeSpecName "kube-api-access-fhqjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.743930 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aafe23ef-6d92-49dc-b311-d85dee42bf0f" (UID: "aafe23ef-6d92-49dc-b311-d85dee42bf0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.828808 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhqjh\" (UniqueName: \"kubernetes.io/projected/aafe23ef-6d92-49dc-b311-d85dee42bf0f-kube-api-access-fhqjh\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.828844 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:25 crc kubenswrapper[4783]: I0131 09:40:25.828857 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aafe23ef-6d92-49dc-b311-d85dee42bf0f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.051843 4783 generic.go:334] "Generic (PLEG): container finished" podID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerID="30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5" exitCode=0
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.051900 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerDied","Data":"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"}
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.051935 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwcr9" event={"ID":"aafe23ef-6d92-49dc-b311-d85dee42bf0f","Type":"ContainerDied","Data":"e9fb5268b3ed9b26c3dcb2cc5cf5e6fbca2a7a493be7f7a353b7a9f6cb11fd3f"}
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.051955 4783 scope.go:117] "RemoveContainer" containerID="30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.052018 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwcr9"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.081846 4783 scope.go:117] "RemoveContainer" containerID="47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.084682 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"]
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.090026 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwcr9"]
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.103247 4783 scope.go:117] "RemoveContainer" containerID="03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.141050 4783 scope.go:117] "RemoveContainer" containerID="30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"
Jan 31 09:40:26 crc kubenswrapper[4783]: E0131 09:40:26.141642 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5\": container with ID starting with 30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5 not found: ID does not exist" containerID="30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.141752 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5"} err="failed to get container status \"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5\": rpc error: code = NotFound desc = could not find container \"30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5\": container with ID starting with 30e03cd9658b7281e9f7b3b86a1bd5700df9fb4c20ce36000f01c79b33ec13f5 not found: ID does not exist"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.141842 4783 scope.go:117] "RemoveContainer" containerID="47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24"
Jan 31 09:40:26 crc kubenswrapper[4783]: E0131 09:40:26.142358 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24\": container with ID starting with 47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24 not found: ID does not exist" containerID="47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.142455 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24"} err="failed to get container status \"47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24\": rpc error: code = NotFound desc = could not find container \"47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24\": container with ID starting with 47f060e6f98f28c2f21f97f698ab8a333edfca808446b2f20bd5602afa6fab24 not found: ID does not exist"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.142532 4783 scope.go:117] "RemoveContainer" containerID="03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9"
Jan 31 09:40:26 crc kubenswrapper[4783]: E0131 09:40:26.142867 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9\": container with ID starting with 03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9 not found: ID does not exist" containerID="03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9"
Jan 31 09:40:26 crc kubenswrapper[4783]: I0131 09:40:26.142955 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9"} err="failed to get container status \"03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9\": rpc error: code = NotFound desc = could not find container \"03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9\": container with ID starting with 03c37eaac722f75c824007ef74c52827b1ffc2ddabb2eb13f40e06dbfa7c00b9 not found: ID does not exist"
Jan 31 09:40:27 crc kubenswrapper[4783]: I0131 09:40:27.654150 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" path="/var/lib/kubelet/pods/aafe23ef-6d92-49dc-b311-d85dee42bf0f/volumes"
Jan 31 09:40:29 crc kubenswrapper[4783]: I0131 09:40:29.656360 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 31 09:40:30 crc kubenswrapper[4783]: I0131 09:40:30.191645 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Jan 31 09:40:31 crc kubenswrapper[4783]: I0131 09:40:31.106820 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16292449-5a29-426e-aa57-18e752dd60f6","Type":"ContainerStarted","Data":"b79ff5f5df22d1e0f297e28214d49ee54bd81eb75693e44846c1afd5b475d2d8"}
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.868234 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=14.717774373 podStartE2EDuration="1m4.868218147s" podCreationTimestamp="2026-01-31 09:39:37 +0000 UTC" firstStartedPulling="2026-01-31 09:39:40.037981277 +0000 UTC m=+2090.706664734" lastFinishedPulling="2026-01-31 09:40:30.18842504 +0000 UTC m=+2140.857108508" observedRunningTime="2026-01-31 09:40:31.129428828 +0000 UTC m=+2141.798112295" watchObservedRunningTime="2026-01-31 09:40:41.868218147 +0000 UTC m=+2152.536901614"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.872807 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:41 crc kubenswrapper[4783]: E0131 09:40:41.873119 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="registry-server"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.873134 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="registry-server"
Jan 31 09:40:41 crc kubenswrapper[4783]: E0131 09:40:41.873156 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="extract-utilities"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.873182 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="extract-utilities"
Jan 31 09:40:41 crc kubenswrapper[4783]: E0131 09:40:41.873201 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="extract-content"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.873206 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="extract-content"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.873355 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="aafe23ef-6d92-49dc-b311-d85dee42bf0f" containerName="registry-server"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.874499 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.887004 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.989903 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nws44\" (UniqueName: \"kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.990066 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:41 crc kubenswrapper[4783]: I0131 09:40:41.990111 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.092627 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.092708 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.092947 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nws44\" (UniqueName: \"kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.093225 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.093622 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.114942 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nws44\" (UniqueName: \"kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44\") pod \"redhat-operators-kbf4j\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") " pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.190792 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:42 crc kubenswrapper[4783]: I0131 09:40:42.672536 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:43 crc kubenswrapper[4783]: I0131 09:40:43.224260 4783 generic.go:334] "Generic (PLEG): container finished" podID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerID="983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1" exitCode=0
Jan 31 09:40:43 crc kubenswrapper[4783]: I0131 09:40:43.224442 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerDied","Data":"983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1"}
Jan 31 09:40:43 crc kubenswrapper[4783]: I0131 09:40:43.224555 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerStarted","Data":"9bd01a25f4944995c25e6f53395eb919e2e470bdbfa556d2bc161c234857d9b6"}
Jan 31 09:40:44 crc kubenswrapper[4783]: I0131 09:40:44.232441 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerStarted","Data":"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"}
Jan 31 09:40:45 crc kubenswrapper[4783]: I0131 09:40:45.242638 4783 generic.go:334] "Generic (PLEG): container finished" podID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerID="2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242" exitCode=0
Jan 31 09:40:45 crc kubenswrapper[4783]: I0131 09:40:45.242752 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerDied","Data":"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"}
Jan 31 09:40:46 crc kubenswrapper[4783]: I0131 09:40:46.256272 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerStarted","Data":"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"}
Jan 31 09:40:46 crc kubenswrapper[4783]: I0131 09:40:46.278236 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbf4j" podStartSLOduration=2.77965065 podStartE2EDuration="5.278220528s" podCreationTimestamp="2026-01-31 09:40:41 +0000 UTC" firstStartedPulling="2026-01-31 09:40:43.225851976 +0000 UTC m=+2153.894535444" lastFinishedPulling="2026-01-31 09:40:45.724421854 +0000 UTC m=+2156.393105322" observedRunningTime="2026-01-31 09:40:46.272149341 +0000 UTC m=+2156.940832809" watchObservedRunningTime="2026-01-31 09:40:46.278220528 +0000 UTC m=+2156.946903996"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.191546 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.192316 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.233231 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.341801 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.649475 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"]
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.658286 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"]
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.658379 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.721776 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.721839 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44nzn\" (UniqueName: \"kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.721870 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.824452 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.824506 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44nzn\" (UniqueName: \"kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.824549 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.825273 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.825522 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.852366 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.853022 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44nzn\" (UniqueName: \"kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn\") pod \"certified-operators-5t9s8\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") " pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:52 crc kubenswrapper[4783]: I0131 09:40:52.981962 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:40:53 crc kubenswrapper[4783]: I0131 09:40:53.474328 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"]
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.350272 4783 generic.go:334] "Generic (PLEG): container finished" podID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerID="f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6" exitCode=0
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.351279 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbf4j" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="registry-server" containerID="cri-o://aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3" gracePeriod=2
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.350375 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerDied","Data":"f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6"}
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.351398 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerStarted","Data":"212797a8c47a5e28f5784d91eda31d5fb4f60fca22c8bfce7bee72243f3e7a16"}
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.767042 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.769929 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities\") pod \"426da86a-4794-4521-ba4b-0ee6549e4cf1\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") "
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.770029 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content\") pod \"426da86a-4794-4521-ba4b-0ee6549e4cf1\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") "
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.770180 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nws44\" (UniqueName: \"kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44\") pod \"426da86a-4794-4521-ba4b-0ee6549e4cf1\" (UID: \"426da86a-4794-4521-ba4b-0ee6549e4cf1\") "
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.770650 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities" (OuterVolumeSpecName: "utilities") pod "426da86a-4794-4521-ba4b-0ee6549e4cf1" (UID: "426da86a-4794-4521-ba4b-0ee6549e4cf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.775487 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44" (OuterVolumeSpecName: "kube-api-access-nws44") pod "426da86a-4794-4521-ba4b-0ee6549e4cf1" (UID: "426da86a-4794-4521-ba4b-0ee6549e4cf1"). InnerVolumeSpecName "kube-api-access-nws44". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.862990 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "426da86a-4794-4521-ba4b-0ee6549e4cf1" (UID: "426da86a-4794-4521-ba4b-0ee6549e4cf1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.873749 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.873788 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/426da86a-4794-4521-ba4b-0ee6549e4cf1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:54 crc kubenswrapper[4783]: I0131 09:40:54.873817 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nws44\" (UniqueName: \"kubernetes.io/projected/426da86a-4794-4521-ba4b-0ee6549e4cf1-kube-api-access-nws44\") on node \"crc\" DevicePath \"\""
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.359385 4783 generic.go:334] "Generic (PLEG): container finished" podID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerID="aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3" exitCode=0
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.359434 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerDied","Data":"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"}
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.359651 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbf4j" event={"ID":"426da86a-4794-4521-ba4b-0ee6549e4cf1","Type":"ContainerDied","Data":"9bd01a25f4944995c25e6f53395eb919e2e470bdbfa556d2bc161c234857d9b6"}
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.359468 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbf4j"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.359695 4783 scope.go:117] "RemoveContainer" containerID="aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.361112 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerStarted","Data":"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385"}
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.373948 4783 scope.go:117] "RemoveContainer" containerID="2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.403446 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.411178 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbf4j"]
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.413374 4783 scope.go:117] "RemoveContainer" containerID="983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.430684 4783 scope.go:117] "RemoveContainer" containerID="aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"
Jan 31 09:40:55 crc kubenswrapper[4783]: E0131 09:40:55.430968 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3\": container with ID starting with aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3 not found: ID does not exist" containerID="aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.430997 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3"} err="failed to get container status \"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3\": rpc error: code = NotFound desc = could not find container \"aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3\": container with ID starting with aaed137ebcc63ee3c478aa90f9cc0e50acb8e872d483078e3c38e99e17cc13a3 not found: ID does not exist"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.431018 4783 scope.go:117] "RemoveContainer" containerID="2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"
Jan 31 09:40:55 crc kubenswrapper[4783]: E0131 09:40:55.431240 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242\": container with ID starting with 2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242 not found: ID does not exist" containerID="2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.431263 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242"} err="failed to get container status \"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242\": rpc error: code = NotFound desc = could not find container \"2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242\": container with ID starting with 2136d7a66a44989e9f90467f212896ec4492391cca8056bd91aa99ae1809b242 not found: ID does not exist"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.431323 4783 scope.go:117] "RemoveContainer" containerID="983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1"
Jan 31 09:40:55 crc kubenswrapper[4783]: E0131 09:40:55.431490 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1\": container with ID starting with 983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1 not found: ID does not exist" containerID="983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.431511 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1"} err="failed to get container status \"983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1\": rpc error: code = NotFound desc = could not find container \"983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1\": container with ID starting with 983f98a77b44b150f7e5180babe989c0b4f1ae66690f30c49caa1d1ec21aeff1 not found: ID does not exist"
Jan 31 09:40:55 crc kubenswrapper[4783]: I0131 09:40:55.654890 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" path="/var/lib/kubelet/pods/426da86a-4794-4521-ba4b-0ee6549e4cf1/volumes"
Jan 31 09:40:56 crc kubenswrapper[4783]: I0131 09:40:56.373034 4783 generic.go:334] "Generic (PLEG): container finished" podID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerID="44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385" exitCode=0
Jan 31 09:40:56 crc kubenswrapper[4783]: I0131 09:40:56.373098 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerDied","Data":"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385"}
Jan 31 09:40:57 crc kubenswrapper[4783]: I0131 09:40:57.387367 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerStarted","Data":"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b"}
Jan 31 09:40:57 crc kubenswrapper[4783]: I0131 09:40:57.401540 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5t9s8" podStartSLOduration=2.899169683 podStartE2EDuration="5.401518876s" podCreationTimestamp="2026-01-31 09:40:52 +0000 UTC" firstStartedPulling="2026-01-31 09:40:54.353482092 +0000 UTC m=+2165.022165561" lastFinishedPulling="2026-01-31 09:40:56.855831286 +0000 UTC m=+2167.524514754" observedRunningTime="2026-01-31 09:40:57.399570603 +0000 UTC m=+2168.068254071" watchObservedRunningTime="2026-01-31 09:40:57.401518876 +0000 UTC m=+2168.070202344"
Jan 31 09:41:02 crc kubenswrapper[4783]: I0131 09:41:02.982372 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:41:02 crc kubenswrapper[4783]: I0131 09:41:02.983050 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:41:03 crc kubenswrapper[4783]: I0131 09:41:03.022133 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:41:03 crc kubenswrapper[4783]: I0131 09:41:03.478820 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:41:03 crc kubenswrapper[4783]: I0131 09:41:03.533648 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"]
Jan 31 09:41:05 crc kubenswrapper[4783]: I0131 09:41:05.457264 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5t9s8" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="registry-server" containerID="cri-o://cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b" gracePeriod=2
Jan 31 09:41:05 crc kubenswrapper[4783]: I0131 09:41:05.921139 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5t9s8"
Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.112272 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44nzn\" (UniqueName: \"kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn\") pod \"faa7114c-d950-4737-bfd5-9d45fc386e5c\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") "
Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.112382 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities\") pod \"faa7114c-d950-4737-bfd5-9d45fc386e5c\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") "
Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.112466 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content\") pod \"faa7114c-d950-4737-bfd5-9d45fc386e5c\" (UID: \"faa7114c-d950-4737-bfd5-9d45fc386e5c\") "
Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.113213 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities" (OuterVolumeSpecName: "utilities") pod
"faa7114c-d950-4737-bfd5-9d45fc386e5c" (UID: "faa7114c-d950-4737-bfd5-9d45fc386e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.113565 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.119634 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn" (OuterVolumeSpecName: "kube-api-access-44nzn") pod "faa7114c-d950-4737-bfd5-9d45fc386e5c" (UID: "faa7114c-d950-4737-bfd5-9d45fc386e5c"). InnerVolumeSpecName "kube-api-access-44nzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.150734 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa7114c-d950-4737-bfd5-9d45fc386e5c" (UID: "faa7114c-d950-4737-bfd5-9d45fc386e5c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.215625 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa7114c-d950-4737-bfd5-9d45fc386e5c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.215665 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44nzn\" (UniqueName: \"kubernetes.io/projected/faa7114c-d950-4737-bfd5-9d45fc386e5c-kube-api-access-44nzn\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.466061 4783 generic.go:334] "Generic (PLEG): container finished" podID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerID="cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b" exitCode=0 Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.466109 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerDied","Data":"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b"} Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.466139 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5t9s8" event={"ID":"faa7114c-d950-4737-bfd5-9d45fc386e5c","Type":"ContainerDied","Data":"212797a8c47a5e28f5784d91eda31d5fb4f60fca22c8bfce7bee72243f3e7a16"} Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.466114 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5t9s8" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.466158 4783 scope.go:117] "RemoveContainer" containerID="cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.487935 4783 scope.go:117] "RemoveContainer" containerID="44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.493237 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"] Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.499208 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5t9s8"] Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.530820 4783 scope.go:117] "RemoveContainer" containerID="f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.547481 4783 scope.go:117] "RemoveContainer" containerID="cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b" Jan 31 09:41:06 crc kubenswrapper[4783]: E0131 09:41:06.547789 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b\": container with ID starting with cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b not found: ID does not exist" containerID="cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.547844 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b"} err="failed to get container status \"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b\": rpc error: code = NotFound desc = could not find 
container \"cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b\": container with ID starting with cb7bcc7794988cee618c292bbbfb80f883489738323cf29b87c85360454a706b not found: ID does not exist" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.547871 4783 scope.go:117] "RemoveContainer" containerID="44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385" Jan 31 09:41:06 crc kubenswrapper[4783]: E0131 09:41:06.548251 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385\": container with ID starting with 44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385 not found: ID does not exist" containerID="44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.548361 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385"} err="failed to get container status \"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385\": rpc error: code = NotFound desc = could not find container \"44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385\": container with ID starting with 44857b46a31a2d7a2fdfacfa2030c0466d737ce94504e707e675fe9b50a83385 not found: ID does not exist" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.548451 4783 scope.go:117] "RemoveContainer" containerID="f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6" Jan 31 09:41:06 crc kubenswrapper[4783]: E0131 09:41:06.548885 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6\": container with ID starting with f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6 not found: ID does 
not exist" containerID="f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6" Jan 31 09:41:06 crc kubenswrapper[4783]: I0131 09:41:06.548916 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6"} err="failed to get container status \"f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6\": rpc error: code = NotFound desc = could not find container \"f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6\": container with ID starting with f78b332a974852861894a6355424a725a336363aefc8fee52d0eb204e20044f6 not found: ID does not exist" Jan 31 09:41:07 crc kubenswrapper[4783]: I0131 09:41:07.658607 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" path="/var/lib/kubelet/pods/faa7114c-d950-4737-bfd5-9d45fc386e5c/volumes" Jan 31 09:42:17 crc kubenswrapper[4783]: I0131 09:42:17.756947 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:42:17 crc kubenswrapper[4783]: I0131 09:42:17.757575 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:42:47 crc kubenswrapper[4783]: I0131 09:42:47.756976 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 31 09:42:47 crc kubenswrapper[4783]: I0131 09:42:47.757452 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:43:17 crc kubenswrapper[4783]: I0131 09:43:17.756302 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:43:17 crc kubenswrapper[4783]: I0131 09:43:17.756833 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:43:17 crc kubenswrapper[4783]: I0131 09:43:17.756881 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:43:17 crc kubenswrapper[4783]: I0131 09:43:17.757537 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:43:17 crc kubenswrapper[4783]: I0131 09:43:17.757589 4783 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" gracePeriod=600 Jan 31 09:43:17 crc kubenswrapper[4783]: E0131 09:43:17.875743 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:43:18 crc kubenswrapper[4783]: I0131 09:43:18.414505 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" exitCode=0 Jan 31 09:43:18 crc kubenswrapper[4783]: I0131 09:43:18.414545 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9"} Jan 31 09:43:18 crc kubenswrapper[4783]: I0131 09:43:18.414584 4783 scope.go:117] "RemoveContainer" containerID="7e1f687b072d842405d8baf7e29e05dc4192665ce37624bd66d9efc7fe830200" Jan 31 09:43:18 crc kubenswrapper[4783]: I0131 09:43:18.414999 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:43:18 crc kubenswrapper[4783]: E0131 09:43:18.415435 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:43:32 crc kubenswrapper[4783]: I0131 09:43:32.645802 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:43:32 crc kubenswrapper[4783]: E0131 09:43:32.646568 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:43:46 crc kubenswrapper[4783]: I0131 09:43:46.646351 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:43:46 crc kubenswrapper[4783]: E0131 09:43:46.647124 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:44:01 crc kubenswrapper[4783]: I0131 09:44:01.645772 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:44:01 crc kubenswrapper[4783]: E0131 09:44:01.646601 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:44:16 crc kubenswrapper[4783]: I0131 09:44:16.646667 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:44:16 crc kubenswrapper[4783]: E0131 09:44:16.647591 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:44:31 crc kubenswrapper[4783]: I0131 09:44:31.645595 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:44:31 crc kubenswrapper[4783]: E0131 09:44:31.646247 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:44:43 crc kubenswrapper[4783]: I0131 09:44:43.645295 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:44:43 crc kubenswrapper[4783]: E0131 09:44:43.646095 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:44:55 crc kubenswrapper[4783]: I0131 09:44:55.645980 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:44:55 crc kubenswrapper[4783]: E0131 09:44:55.646905 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.140583 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x"] Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141709 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="extract-utilities" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141722 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="extract-utilities" Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141737 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="extract-utilities" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141742 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="extract-utilities" Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141751 
4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="extract-content" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141757 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="extract-content" Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141771 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141778 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141791 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141796 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: E0131 09:45:00.141804 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="extract-content" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141809 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="extract-content" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.141993 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa7114c-d950-4737-bfd5-9d45fc386e5c" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.142016 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="426da86a-4794-4521-ba4b-0ee6549e4cf1" containerName="registry-server" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.142674 4783 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.144608 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.145889 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.151250 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x"] Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.221068 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.221295 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdg2v\" (UniqueName: \"kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.221406 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.322924 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.323029 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdg2v\" (UniqueName: \"kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.323082 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.324577 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.331088 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.337684 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdg2v\" (UniqueName: \"kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v\") pod \"collect-profiles-29497545-jvw5x\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.457899 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:00 crc kubenswrapper[4783]: I0131 09:45:00.836457 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x"] Jan 31 09:45:01 crc kubenswrapper[4783]: I0131 09:45:01.089246 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" event={"ID":"42a71827-93fd-4953-be74-787fca18706f","Type":"ContainerStarted","Data":"4151d061392eacefc2dcfc2be06e2a3952360d5e09b30316e1da0c4cd3411584"} Jan 31 09:45:01 crc kubenswrapper[4783]: I0131 09:45:01.089302 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" event={"ID":"42a71827-93fd-4953-be74-787fca18706f","Type":"ContainerStarted","Data":"6009f93554742e095320c7a64b01679395de7bb79c9abc95a3af66b4f1ca3e78"} Jan 31 09:45:01 crc kubenswrapper[4783]: I0131 09:45:01.108772 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" 
podStartSLOduration=1.108751936 podStartE2EDuration="1.108751936s" podCreationTimestamp="2026-01-31 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:01.102473859 +0000 UTC m=+2411.771157327" watchObservedRunningTime="2026-01-31 09:45:01.108751936 +0000 UTC m=+2411.777435405" Jan 31 09:45:02 crc kubenswrapper[4783]: I0131 09:45:02.099017 4783 generic.go:334] "Generic (PLEG): container finished" podID="42a71827-93fd-4953-be74-787fca18706f" containerID="4151d061392eacefc2dcfc2be06e2a3952360d5e09b30316e1da0c4cd3411584" exitCode=0 Jan 31 09:45:02 crc kubenswrapper[4783]: I0131 09:45:02.099089 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" event={"ID":"42a71827-93fd-4953-be74-787fca18706f","Type":"ContainerDied","Data":"4151d061392eacefc2dcfc2be06e2a3952360d5e09b30316e1da0c4cd3411584"} Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.395705 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.578342 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdg2v\" (UniqueName: \"kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v\") pod \"42a71827-93fd-4953-be74-787fca18706f\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.578659 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume\") pod \"42a71827-93fd-4953-be74-787fca18706f\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.578681 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume\") pod \"42a71827-93fd-4953-be74-787fca18706f\" (UID: \"42a71827-93fd-4953-be74-787fca18706f\") " Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.579129 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume" (OuterVolumeSpecName: "config-volume") pod "42a71827-93fd-4953-be74-787fca18706f" (UID: "42a71827-93fd-4953-be74-787fca18706f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.583277 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v" (OuterVolumeSpecName: "kube-api-access-pdg2v") pod "42a71827-93fd-4953-be74-787fca18706f" (UID: "42a71827-93fd-4953-be74-787fca18706f"). 
InnerVolumeSpecName "kube-api-access-pdg2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.583613 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42a71827-93fd-4953-be74-787fca18706f" (UID: "42a71827-93fd-4953-be74-787fca18706f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.680832 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdg2v\" (UniqueName: \"kubernetes.io/projected/42a71827-93fd-4953-be74-787fca18706f-kube-api-access-pdg2v\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.680863 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a71827-93fd-4953-be74-787fca18706f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:03 crc kubenswrapper[4783]: I0131 09:45:03.680921 4783 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42a71827-93fd-4953-be74-787fca18706f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4783]: I0131 09:45:04.113245 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" event={"ID":"42a71827-93fd-4953-be74-787fca18706f","Type":"ContainerDied","Data":"6009f93554742e095320c7a64b01679395de7bb79c9abc95a3af66b4f1ca3e78"} Jan 31 09:45:04 crc kubenswrapper[4783]: I0131 09:45:04.113289 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6009f93554742e095320c7a64b01679395de7bb79c9abc95a3af66b4f1ca3e78" Jan 31 09:45:04 crc kubenswrapper[4783]: I0131 09:45:04.113283 4783 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-jvw5x" Jan 31 09:45:04 crc kubenswrapper[4783]: I0131 09:45:04.450806 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g"] Jan 31 09:45:04 crc kubenswrapper[4783]: I0131 09:45:04.458797 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-ljz8g"] Jan 31 09:45:05 crc kubenswrapper[4783]: I0131 09:45:05.655023 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01828fd-fba3-487c-a2c6-f5599e1c379d" path="/var/lib/kubelet/pods/f01828fd-fba3-487c-a2c6-f5599e1c379d/volumes" Jan 31 09:45:09 crc kubenswrapper[4783]: I0131 09:45:09.651348 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:45:09 crc kubenswrapper[4783]: E0131 09:45:09.651944 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:45:24 crc kubenswrapper[4783]: I0131 09:45:24.645747 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:45:24 crc kubenswrapper[4783]: E0131 09:45:24.646622 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:45:38 crc kubenswrapper[4783]: I0131 09:45:38.646611 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:45:38 crc kubenswrapper[4783]: E0131 09:45:38.647850 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:45:51 crc kubenswrapper[4783]: I0131 09:45:51.645747 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:45:51 crc kubenswrapper[4783]: E0131 09:45:51.646623 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:46:01 crc kubenswrapper[4783]: I0131 09:46:01.023586 4783 scope.go:117] "RemoveContainer" containerID="4ade95b367c1495ae5ec997fab2d073aca17ec4cedaa325800e7e230d740ec37" Jan 31 09:46:05 crc kubenswrapper[4783]: I0131 09:46:05.645912 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:46:05 crc kubenswrapper[4783]: E0131 09:46:05.647182 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:46:18 crc kubenswrapper[4783]: I0131 09:46:18.645911 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:46:18 crc kubenswrapper[4783]: E0131 09:46:18.646821 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:46:32 crc kubenswrapper[4783]: I0131 09:46:32.646201 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:46:32 crc kubenswrapper[4783]: E0131 09:46:32.647262 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:46:43 crc kubenswrapper[4783]: I0131 09:46:43.645520 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:46:43 crc kubenswrapper[4783]: E0131 09:46:43.646244 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:46:56 crc kubenswrapper[4783]: I0131 09:46:56.645726 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:46:56 crc kubenswrapper[4783]: E0131 09:46:56.646521 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:47:10 crc kubenswrapper[4783]: I0131 09:47:10.646431 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:47:10 crc kubenswrapper[4783]: E0131 09:47:10.647271 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:47:25 crc kubenswrapper[4783]: I0131 09:47:25.645829 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:47:25 crc kubenswrapper[4783]: E0131 09:47:25.646710 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:47:36 crc kubenswrapper[4783]: I0131 09:47:36.645720 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:47:36 crc kubenswrapper[4783]: E0131 09:47:36.646539 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:47:49 crc kubenswrapper[4783]: I0131 09:47:49.646611 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:47:49 crc kubenswrapper[4783]: E0131 09:47:49.647508 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:48:02 crc kubenswrapper[4783]: I0131 09:48:02.645447 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:48:02 crc kubenswrapper[4783]: E0131 09:48:02.646143 4783 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:48:15 crc kubenswrapper[4783]: I0131 09:48:15.646572 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:48:15 crc kubenswrapper[4783]: E0131 09:48:15.647478 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:48:26 crc kubenswrapper[4783]: I0131 09:48:26.646501 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:48:27 crc kubenswrapper[4783]: I0131 09:48:27.680495 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef"} Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.519831 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:48:47 crc kubenswrapper[4783]: E0131 09:48:47.541081 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42a71827-93fd-4953-be74-787fca18706f" containerName="collect-profiles" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 
09:48:47.541108 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="42a71827-93fd-4953-be74-787fca18706f" containerName="collect-profiles" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.541584 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="42a71827-93fd-4953-be74-787fca18706f" containerName="collect-profiles" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.543759 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.567133 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.593389 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.593429 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.593469 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmbr\" (UniqueName: \"kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc 
kubenswrapper[4783]: I0131 09:48:47.695125 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.695419 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.695461 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmbr\" (UniqueName: \"kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.695597 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.695936 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.712945 
4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmbr\" (UniqueName: \"kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr\") pod \"community-operators-464s8\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:47 crc kubenswrapper[4783]: I0131 09:48:47.868727 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:48 crc kubenswrapper[4783]: I0131 09:48:48.332316 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:48:48 crc kubenswrapper[4783]: I0131 09:48:48.890308 4783 generic.go:334] "Generic (PLEG): container finished" podID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerID="4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5" exitCode=0 Jan 31 09:48:48 crc kubenswrapper[4783]: I0131 09:48:48.890521 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerDied","Data":"4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5"} Jan 31 09:48:48 crc kubenswrapper[4783]: I0131 09:48:48.890651 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerStarted","Data":"239b1385ffe1699ec8d9788031e8ab18e6a5179454eb538391aa4216f30daff9"} Jan 31 09:48:48 crc kubenswrapper[4783]: I0131 09:48:48.893725 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:48:49 crc kubenswrapper[4783]: I0131 09:48:49.901854 4783 generic.go:334] "Generic (PLEG): container finished" podID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" 
containerID="9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9" exitCode=0 Jan 31 09:48:49 crc kubenswrapper[4783]: I0131 09:48:49.901911 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerDied","Data":"9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9"} Jan 31 09:48:50 crc kubenswrapper[4783]: I0131 09:48:50.911817 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerStarted","Data":"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d"} Jan 31 09:48:50 crc kubenswrapper[4783]: I0131 09:48:50.930534 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-464s8" podStartSLOduration=2.434831303 podStartE2EDuration="3.930515883s" podCreationTimestamp="2026-01-31 09:48:47 +0000 UTC" firstStartedPulling="2026-01-31 09:48:48.893471828 +0000 UTC m=+2639.562155296" lastFinishedPulling="2026-01-31 09:48:50.389156408 +0000 UTC m=+2641.057839876" observedRunningTime="2026-01-31 09:48:50.930027923 +0000 UTC m=+2641.598711390" watchObservedRunningTime="2026-01-31 09:48:50.930515883 +0000 UTC m=+2641.599199350" Jan 31 09:48:56 crc kubenswrapper[4783]: I0131 09:48:56.969798 4783 generic.go:334] "Generic (PLEG): container finished" podID="16292449-5a29-426e-aa57-18e752dd60f6" containerID="b79ff5f5df22d1e0f297e28214d49ee54bd81eb75693e44846c1afd5b475d2d8" exitCode=0 Jan 31 09:48:56 crc kubenswrapper[4783]: I0131 09:48:56.969890 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16292449-5a29-426e-aa57-18e752dd60f6","Type":"ContainerDied","Data":"b79ff5f5df22d1e0f297e28214d49ee54bd81eb75693e44846c1afd5b475d2d8"} Jan 31 09:48:57 crc kubenswrapper[4783]: I0131 09:48:57.870734 4783 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:57 crc kubenswrapper[4783]: I0131 09:48:57.870771 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:57 crc kubenswrapper[4783]: I0131 09:48:57.904992 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.016732 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.264775 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343251 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343322 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343361 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 
09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343403 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343524 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343547 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqq4d\" (UniqueName: \"kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343634 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343659 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.343679 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config\") pod \"16292449-5a29-426e-aa57-18e752dd60f6\" (UID: \"16292449-5a29-426e-aa57-18e752dd60f6\") " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.344096 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.344549 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data" (OuterVolumeSpecName: "config-data") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.349617 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d" (OuterVolumeSpecName: "kube-api-access-bqq4d") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "kube-api-access-bqq4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.349685 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.349783 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.370125 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.370562 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.371195 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.382327 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "16292449-5a29-426e-aa57-18e752dd60f6" (UID: "16292449-5a29-426e-aa57-18e752dd60f6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445871 4783 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445909 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445927 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445960 4783 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445971 4783 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16292449-5a29-426e-aa57-18e752dd60f6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445980 4783 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqq4d\" (UniqueName: \"kubernetes.io/projected/16292449-5a29-426e-aa57-18e752dd60f6-kube-api-access-bqq4d\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445990 4783 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.445997 4783 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16292449-5a29-426e-aa57-18e752dd60f6-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.446005 4783 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16292449-5a29-426e-aa57-18e752dd60f6-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.460841 4783 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.517898 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.549448 4783 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.986012 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16292449-5a29-426e-aa57-18e752dd60f6","Type":"ContainerDied","Data":"b818a9958edc8dd84b6997ca42ed80cf9d041806c0bd3daabe9f70708b9f0713"} Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 
09:48:58.986208 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b818a9958edc8dd84b6997ca42ed80cf9d041806c0bd3daabe9f70708b9f0713" Jan 31 09:48:58 crc kubenswrapper[4783]: I0131 09:48:58.986064 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 31 09:48:59 crc kubenswrapper[4783]: I0131 09:48:59.992784 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-464s8" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="registry-server" containerID="cri-o://381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d" gracePeriod=2 Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.384406 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.487458 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmbr\" (UniqueName: \"kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr\") pod \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.487566 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities\") pod \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.487726 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content\") pod \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\" (UID: \"6c9640f0-5b4d-4a67-8731-553c6f2b3f44\") " Jan 31 
09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.488490 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities" (OuterVolumeSpecName: "utilities") pod "6c9640f0-5b4d-4a67-8731-553c6f2b3f44" (UID: "6c9640f0-5b4d-4a67-8731-553c6f2b3f44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.494519 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr" (OuterVolumeSpecName: "kube-api-access-snmbr") pod "6c9640f0-5b4d-4a67-8731-553c6f2b3f44" (UID: "6c9640f0-5b4d-4a67-8731-553c6f2b3f44"). InnerVolumeSpecName "kube-api-access-snmbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.525944 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c9640f0-5b4d-4a67-8731-553c6f2b3f44" (UID: "6c9640f0-5b4d-4a67-8731-553c6f2b3f44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.590043 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmbr\" (UniqueName: \"kubernetes.io/projected/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-kube-api-access-snmbr\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.590073 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:00 crc kubenswrapper[4783]: I0131 09:49:00.590083 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9640f0-5b4d-4a67-8731-553c6f2b3f44-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.009086 4783 generic.go:334] "Generic (PLEG): container finished" podID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerID="381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d" exitCode=0 Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.009139 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerDied","Data":"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d"} Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.009574 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-464s8" event={"ID":"6c9640f0-5b4d-4a67-8731-553c6f2b3f44","Type":"ContainerDied","Data":"239b1385ffe1699ec8d9788031e8ab18e6a5179454eb538391aa4216f30daff9"} Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.009602 4783 scope.go:117] "RemoveContainer" containerID="381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 
09:49:01.009217 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-464s8" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.040507 4783 scope.go:117] "RemoveContainer" containerID="9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.050055 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.061963 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-464s8"] Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.061962 4783 scope.go:117] "RemoveContainer" containerID="4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.101988 4783 scope.go:117] "RemoveContainer" containerID="381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d" Jan 31 09:49:01 crc kubenswrapper[4783]: E0131 09:49:01.104730 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d\": container with ID starting with 381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d not found: ID does not exist" containerID="381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.104775 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d"} err="failed to get container status \"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d\": rpc error: code = NotFound desc = could not find container \"381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d\": container with ID starting with 
381c4b23c7024fd89d13101010121b4c17f1e8fb8feba43ef24cc1699084953d not found: ID does not exist" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.104798 4783 scope.go:117] "RemoveContainer" containerID="9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9" Jan 31 09:49:01 crc kubenswrapper[4783]: E0131 09:49:01.105011 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9\": container with ID starting with 9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9 not found: ID does not exist" containerID="9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.105034 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9"} err="failed to get container status \"9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9\": rpc error: code = NotFound desc = could not find container \"9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9\": container with ID starting with 9df66d1d2631a0fbc9176f0acd2863d8d19ef5a7163535f6c0809f529a2d98b9 not found: ID does not exist" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.105054 4783 scope.go:117] "RemoveContainer" containerID="4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5" Jan 31 09:49:01 crc kubenswrapper[4783]: E0131 09:49:01.106136 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5\": container with ID starting with 4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5 not found: ID does not exist" containerID="4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5" Jan 31 09:49:01 crc 
kubenswrapper[4783]: I0131 09:49:01.106206 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5"} err="failed to get container status \"4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5\": rpc error: code = NotFound desc = could not find container \"4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5\": container with ID starting with 4d1ba4041b8c7a262848a954d356ed464b5f0da43445e32132dbf97b25600ca5 not found: ID does not exist" Jan 31 09:49:01 crc kubenswrapper[4783]: I0131 09:49:01.656330 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" path="/var/lib/kubelet/pods/6c9640f0-5b4d-4a67-8731-553c6f2b3f44/volumes" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.426927 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 09:49:02 crc kubenswrapper[4783]: E0131 09:49:02.427879 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="registry-server" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.427903 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="registry-server" Jan 31 09:49:02 crc kubenswrapper[4783]: E0131 09:49:02.427931 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="extract-utilities" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.427941 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="extract-utilities" Jan 31 09:49:02 crc kubenswrapper[4783]: E0131 09:49:02.427955 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" 
containerName="extract-content" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.427961 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="extract-content" Jan 31 09:49:02 crc kubenswrapper[4783]: E0131 09:49:02.427975 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16292449-5a29-426e-aa57-18e752dd60f6" containerName="tempest-tests-tempest-tests-runner" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.427983 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="16292449-5a29-426e-aa57-18e752dd60f6" containerName="tempest-tests-tempest-tests-runner" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.428279 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9640f0-5b4d-4a67-8731-553c6f2b3f44" containerName="registry-server" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.428293 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="16292449-5a29-426e-aa57-18e752dd60f6" containerName="tempest-tests-tempest-tests-runner" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.429296 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.432744 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6f9wl" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.434255 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.530262 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.530307 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbbd\" (UniqueName: \"kubernetes.io/projected/3438915b-514e-4298-842a-c77aefe49803-kube-api-access-pbbbd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.632497 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.632554 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbbd\" (UniqueName: 
\"kubernetes.io/projected/3438915b-514e-4298-842a-c77aefe49803-kube-api-access-pbbbd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.632854 4783 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.660008 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbbd\" (UniqueName: \"kubernetes.io/projected/3438915b-514e-4298-842a-c77aefe49803-kube-api-access-pbbbd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.661404 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3438915b-514e-4298-842a-c77aefe49803\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:02 crc kubenswrapper[4783]: I0131 09:49:02.749789 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 09:49:03 crc kubenswrapper[4783]: I0131 09:49:03.137063 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 09:49:04 crc kubenswrapper[4783]: I0131 09:49:04.035263 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3438915b-514e-4298-842a-c77aefe49803","Type":"ContainerStarted","Data":"684e2103656fe95b7def5580f4e12955ad850d20385279f2f6786e489a24fec0"} Jan 31 09:49:05 crc kubenswrapper[4783]: I0131 09:49:05.047930 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3438915b-514e-4298-842a-c77aefe49803","Type":"ContainerStarted","Data":"e2cc0e8a879855cb38f3326211d7389e836543089df35de71fd0a278483cf795"} Jan 31 09:49:05 crc kubenswrapper[4783]: I0131 09:49:05.065079 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.161379523 podStartE2EDuration="3.065060944s" podCreationTimestamp="2026-01-31 09:49:02 +0000 UTC" firstStartedPulling="2026-01-31 09:49:03.142403775 +0000 UTC m=+2653.811087243" lastFinishedPulling="2026-01-31 09:49:04.046085196 +0000 UTC m=+2654.714768664" observedRunningTime="2026-01-31 09:49:05.060284659 +0000 UTC m=+2655.728968126" watchObservedRunningTime="2026-01-31 09:49:05.065060944 +0000 UTC m=+2655.733744411" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.634708 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5rgv/must-gather-l9j56"] Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.637350 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.641800 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f5rgv"/"openshift-service-ca.crt" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.641956 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f5rgv"/"kube-root-ca.crt" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.651391 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5rgv/must-gather-l9j56"] Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.713203 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.713298 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn8j5\" (UniqueName: \"kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.815193 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.815261 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xn8j5\" (UniqueName: \"kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.815710 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.837509 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn8j5\" (UniqueName: \"kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5\") pod \"must-gather-l9j56\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:22 crc kubenswrapper[4783]: I0131 09:49:22.955047 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:49:23 crc kubenswrapper[4783]: I0131 09:49:23.353879 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5rgv/must-gather-l9j56"] Jan 31 09:49:23 crc kubenswrapper[4783]: W0131 09:49:23.357020 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7771c91_54d0_4aec_aacf_267513a0eea2.slice/crio-e2a0fd8f4904dd8e23da05a469c0dc030b487c09cf69f6a826eff53ce542ed63 WatchSource:0}: Error finding container e2a0fd8f4904dd8e23da05a469c0dc030b487c09cf69f6a826eff53ce542ed63: Status 404 returned error can't find the container with id e2a0fd8f4904dd8e23da05a469c0dc030b487c09cf69f6a826eff53ce542ed63 Jan 31 09:49:24 crc kubenswrapper[4783]: I0131 09:49:24.215557 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/must-gather-l9j56" event={"ID":"a7771c91-54d0-4aec-aacf-267513a0eea2","Type":"ContainerStarted","Data":"e2a0fd8f4904dd8e23da05a469c0dc030b487c09cf69f6a826eff53ce542ed63"} Jan 31 09:49:28 crc kubenswrapper[4783]: I0131 09:49:28.249890 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/must-gather-l9j56" event={"ID":"a7771c91-54d0-4aec-aacf-267513a0eea2","Type":"ContainerStarted","Data":"3914445ec00f61c2d5da25f7bce90a71fbc916f5864f19f7200f04a4d61c77a2"} Jan 31 09:49:28 crc kubenswrapper[4783]: I0131 09:49:28.250495 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/must-gather-l9j56" event={"ID":"a7771c91-54d0-4aec-aacf-267513a0eea2","Type":"ContainerStarted","Data":"6b1b6ba4969b810c9d93a1977b178a89eca60ef39a705ab887b25ca3ad2b640d"} Jan 31 09:49:28 crc kubenswrapper[4783]: I0131 09:49:28.279435 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5rgv/must-gather-l9j56" podStartSLOduration=1.999655551 
podStartE2EDuration="6.27940704s" podCreationTimestamp="2026-01-31 09:49:22 +0000 UTC" firstStartedPulling="2026-01-31 09:49:23.359324924 +0000 UTC m=+2674.028008392" lastFinishedPulling="2026-01-31 09:49:27.639076412 +0000 UTC m=+2678.307759881" observedRunningTime="2026-01-31 09:49:28.266065835 +0000 UTC m=+2678.934749303" watchObservedRunningTime="2026-01-31 09:49:28.27940704 +0000 UTC m=+2678.948090507" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.165574 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-n92dj"] Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.168883 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.171221 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f5rgv"/"default-dockercfg-frk8h" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.188607 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4njj\" (UniqueName: \"kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj\") pod \"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.189136 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host\") pod \"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.291584 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host\") pod 
\"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.291770 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4njj\" (UniqueName: \"kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj\") pod \"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.291782 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host\") pod \"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.315765 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4njj\" (UniqueName: \"kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj\") pod \"crc-debug-n92dj\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: I0131 09:49:31.493287 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:49:31 crc kubenswrapper[4783]: W0131 09:49:31.522844 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab1f6efd_8417_4761_9b40_f3d54c6d8374.slice/crio-b238e3b512cb8ec5078ae017ead3e997201a441a171db74263dea624335c0ec5 WatchSource:0}: Error finding container b238e3b512cb8ec5078ae017ead3e997201a441a171db74263dea624335c0ec5: Status 404 returned error can't find the container with id b238e3b512cb8ec5078ae017ead3e997201a441a171db74263dea624335c0ec5 Jan 31 09:49:32 crc kubenswrapper[4783]: I0131 09:49:32.286777 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" event={"ID":"ab1f6efd-8417-4761-9b40-f3d54c6d8374","Type":"ContainerStarted","Data":"b238e3b512cb8ec5078ae017ead3e997201a441a171db74263dea624335c0ec5"} Jan 31 09:49:41 crc kubenswrapper[4783]: I0131 09:49:41.366429 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" event={"ID":"ab1f6efd-8417-4761-9b40-f3d54c6d8374","Type":"ContainerStarted","Data":"c3ae5158febf577f33113a8f5df4077fc04ea665e47e4a7d534ff23ca583a06a"} Jan 31 09:49:41 crc kubenswrapper[4783]: I0131 09:49:41.386594 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" podStartSLOduration=0.957928002 podStartE2EDuration="10.386575434s" podCreationTimestamp="2026-01-31 09:49:31 +0000 UTC" firstStartedPulling="2026-01-31 09:49:31.524943558 +0000 UTC m=+2682.193627026" lastFinishedPulling="2026-01-31 09:49:40.95359099 +0000 UTC m=+2691.622274458" observedRunningTime="2026-01-31 09:49:41.380590859 +0000 UTC m=+2692.049274328" watchObservedRunningTime="2026-01-31 09:49:41.386575434 +0000 UTC m=+2692.055258901" Jan 31 09:50:12 crc kubenswrapper[4783]: I0131 09:50:12.616970 4783 generic.go:334] "Generic (PLEG): container 
finished" podID="ab1f6efd-8417-4761-9b40-f3d54c6d8374" containerID="c3ae5158febf577f33113a8f5df4077fc04ea665e47e4a7d534ff23ca583a06a" exitCode=0 Jan 31 09:50:12 crc kubenswrapper[4783]: I0131 09:50:12.617104 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" event={"ID":"ab1f6efd-8417-4761-9b40-f3d54c6d8374","Type":"ContainerDied","Data":"c3ae5158febf577f33113a8f5df4077fc04ea665e47e4a7d534ff23ca583a06a"} Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.723286 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.763532 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-n92dj"] Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.771308 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-n92dj"] Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.849036 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host\") pod \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.849118 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4njj\" (UniqueName: \"kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj\") pod \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\" (UID: \"ab1f6efd-8417-4761-9b40-f3d54c6d8374\") " Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.849223 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host" (OuterVolumeSpecName: "host") pod "ab1f6efd-8417-4761-9b40-f3d54c6d8374" (UID: 
"ab1f6efd-8417-4761-9b40-f3d54c6d8374"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.850584 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab1f6efd-8417-4761-9b40-f3d54c6d8374-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.855343 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj" (OuterVolumeSpecName: "kube-api-access-x4njj") pod "ab1f6efd-8417-4761-9b40-f3d54c6d8374" (UID: "ab1f6efd-8417-4761-9b40-f3d54c6d8374"). InnerVolumeSpecName "kube-api-access-x4njj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:50:13 crc kubenswrapper[4783]: I0131 09:50:13.952719 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4njj\" (UniqueName: \"kubernetes.io/projected/ab1f6efd-8417-4761-9b40-f3d54c6d8374-kube-api-access-x4njj\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.633920 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b238e3b512cb8ec5078ae017ead3e997201a441a171db74263dea624335c0ec5" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.634044 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-n92dj" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.908406 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-55fsn"] Jan 31 09:50:14 crc kubenswrapper[4783]: E0131 09:50:14.908870 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1f6efd-8417-4761-9b40-f3d54c6d8374" containerName="container-00" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.908883 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1f6efd-8417-4761-9b40-f3d54c6d8374" containerName="container-00" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.909083 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1f6efd-8417-4761-9b40-f3d54c6d8374" containerName="container-00" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.911320 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:14 crc kubenswrapper[4783]: I0131 09:50:14.913064 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f5rgv"/"default-dockercfg-frk8h" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.079071 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd47g\" (UniqueName: \"kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.079725 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " 
pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.181389 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd47g\" (UniqueName: \"kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.181520 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.181664 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.196796 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd47g\" (UniqueName: \"kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g\") pod \"crc-debug-55fsn\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.226488 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.645346 4783 generic.go:334] "Generic (PLEG): container finished" podID="38a214b3-023f-40b7-aef4-ecdebcf463fc" containerID="b2842cf3cd50069798490e5f4453e58831f617c027487b233458c25e766a6e91" exitCode=0 Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.654030 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1f6efd-8417-4761-9b40-f3d54c6d8374" path="/var/lib/kubelet/pods/ab1f6efd-8417-4761-9b40-f3d54c6d8374/volumes" Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.654798 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" event={"ID":"38a214b3-023f-40b7-aef4-ecdebcf463fc","Type":"ContainerDied","Data":"b2842cf3cd50069798490e5f4453e58831f617c027487b233458c25e766a6e91"} Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.654849 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" event={"ID":"38a214b3-023f-40b7-aef4-ecdebcf463fc","Type":"ContainerStarted","Data":"700b7c968670cdc3ad858f01f0c7edf4123b57473d55e0ae83c0a66b1a4a4f3b"} Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.987020 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-55fsn"] Jan 31 09:50:15 crc kubenswrapper[4783]: I0131 09:50:15.993768 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-55fsn"] Jan 31 09:50:16 crc kubenswrapper[4783]: I0131 09:50:16.729411 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:16 crc kubenswrapper[4783]: I0131 09:50:16.920227 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd47g\" (UniqueName: \"kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g\") pod \"38a214b3-023f-40b7-aef4-ecdebcf463fc\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " Jan 31 09:50:16 crc kubenswrapper[4783]: I0131 09:50:16.920386 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host\") pod \"38a214b3-023f-40b7-aef4-ecdebcf463fc\" (UID: \"38a214b3-023f-40b7-aef4-ecdebcf463fc\") " Jan 31 09:50:16 crc kubenswrapper[4783]: I0131 09:50:16.920806 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host" (OuterVolumeSpecName: "host") pod "38a214b3-023f-40b7-aef4-ecdebcf463fc" (UID: "38a214b3-023f-40b7-aef4-ecdebcf463fc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:50:16 crc kubenswrapper[4783]: I0131 09:50:16.925835 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g" (OuterVolumeSpecName: "kube-api-access-qd47g") pod "38a214b3-023f-40b7-aef4-ecdebcf463fc" (UID: "38a214b3-023f-40b7-aef4-ecdebcf463fc"). InnerVolumeSpecName "kube-api-access-qd47g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.022887 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd47g\" (UniqueName: \"kubernetes.io/projected/38a214b3-023f-40b7-aef4-ecdebcf463fc-kube-api-access-qd47g\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.022924 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38a214b3-023f-40b7-aef4-ecdebcf463fc-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.137726 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-gmzp2"] Jan 31 09:50:17 crc kubenswrapper[4783]: E0131 09:50:17.138208 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a214b3-023f-40b7-aef4-ecdebcf463fc" containerName="container-00" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.138229 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a214b3-023f-40b7-aef4-ecdebcf463fc" containerName="container-00" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.138437 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a214b3-023f-40b7-aef4-ecdebcf463fc" containerName="container-00" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.139177 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.330477 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dq5\" (UniqueName: \"kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.331392 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.433450 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.433591 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dq5\" (UniqueName: \"kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.433605 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc 
kubenswrapper[4783]: I0131 09:50:17.448917 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dq5\" (UniqueName: \"kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5\") pod \"crc-debug-gmzp2\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.454066 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:17 crc kubenswrapper[4783]: W0131 09:50:17.475012 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c1406d5_2707_4f81_a41e_f25af0ca8627.slice/crio-8bf1bf5f57da579247027cd65ea7e342aa4e92ad09c56cd2067745c43bfbbd55 WatchSource:0}: Error finding container 8bf1bf5f57da579247027cd65ea7e342aa4e92ad09c56cd2067745c43bfbbd55: Status 404 returned error can't find the container with id 8bf1bf5f57da579247027cd65ea7e342aa4e92ad09c56cd2067745c43bfbbd55 Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.654014 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a214b3-023f-40b7-aef4-ecdebcf463fc" path="/var/lib/kubelet/pods/38a214b3-023f-40b7-aef4-ecdebcf463fc/volumes" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.661633 4783 scope.go:117] "RemoveContainer" containerID="b2842cf3cd50069798490e5f4453e58831f617c027487b233458c25e766a6e91" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.661668 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-55fsn" Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.663232 4783 generic.go:334] "Generic (PLEG): container finished" podID="0c1406d5-2707-4f81-a41e-f25af0ca8627" containerID="b1eaa6af4efffd8b72d87d314ea705ddaf47f40a0f6ad41fc198845c827c21bc" exitCode=0 Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.663282 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" event={"ID":"0c1406d5-2707-4f81-a41e-f25af0ca8627","Type":"ContainerDied","Data":"b1eaa6af4efffd8b72d87d314ea705ddaf47f40a0f6ad41fc198845c827c21bc"} Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.663310 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" event={"ID":"0c1406d5-2707-4f81-a41e-f25af0ca8627","Type":"ContainerStarted","Data":"8bf1bf5f57da579247027cd65ea7e342aa4e92ad09c56cd2067745c43bfbbd55"} Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.708286 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-gmzp2"] Jan 31 09:50:17 crc kubenswrapper[4783]: I0131 09:50:17.713625 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5rgv/crc-debug-gmzp2"] Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.752412 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.755882 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8dq5\" (UniqueName: \"kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5\") pod \"0c1406d5-2707-4f81-a41e-f25af0ca8627\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.755925 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host\") pod \"0c1406d5-2707-4f81-a41e-f25af0ca8627\" (UID: \"0c1406d5-2707-4f81-a41e-f25af0ca8627\") " Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.756198 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host" (OuterVolumeSpecName: "host") pod "0c1406d5-2707-4f81-a41e-f25af0ca8627" (UID: "0c1406d5-2707-4f81-a41e-f25af0ca8627"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.760373 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5" (OuterVolumeSpecName: "kube-api-access-l8dq5") pod "0c1406d5-2707-4f81-a41e-f25af0ca8627" (UID: "0c1406d5-2707-4f81-a41e-f25af0ca8627"). InnerVolumeSpecName "kube-api-access-l8dq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.857674 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8dq5\" (UniqueName: \"kubernetes.io/projected/0c1406d5-2707-4f81-a41e-f25af0ca8627-kube-api-access-l8dq5\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:18 crc kubenswrapper[4783]: I0131 09:50:18.857702 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c1406d5-2707-4f81-a41e-f25af0ca8627-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:19 crc kubenswrapper[4783]: I0131 09:50:19.655843 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1406d5-2707-4f81-a41e-f25af0ca8627" path="/var/lib/kubelet/pods/0c1406d5-2707-4f81-a41e-f25af0ca8627/volumes" Jan 31 09:50:19 crc kubenswrapper[4783]: I0131 09:50:19.683498 4783 scope.go:117] "RemoveContainer" containerID="b1eaa6af4efffd8b72d87d314ea705ddaf47f40a0f6ad41fc198845c827c21bc" Jan 31 09:50:19 crc kubenswrapper[4783]: I0131 09:50:19.683546 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/crc-debug-gmzp2" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.404277 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7994d94564-47gt2_6ec6cc70-cb74-40f6-acb3-3423b5045651/barbican-api/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.540773 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7994d94564-47gt2_6ec6cc70-cb74-40f6-acb3-3423b5045651/barbican-api-log/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.566601 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5cf88c75b6-glzzx_3cd5ec39-81e0-44cd-b99f-01e3d301b192/barbican-keystone-listener/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.628938 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5cf88c75b6-glzzx_3cd5ec39-81e0-44cd-b99f-01e3d301b192/barbican-keystone-listener-log/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.749496 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c946647fc-lsk5p_814c0372-f441-4fce-b7d3-47827597fdd5/barbican-worker/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.769581 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c946647fc-lsk5p_814c0372-f441-4fce-b7d3-47827597fdd5/barbican-worker-log/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.932669 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj_b55d8f8e-46cb-4119-a32b-723b06e29764/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:32 crc kubenswrapper[4783]: I0131 09:50:32.969338 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/ceilometer-central-agent/0.log" 
Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.084361 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/ceilometer-notification-agent/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.090281 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/proxy-httpd/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.150330 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/sg-core/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.251399 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_efc715c4-8350-4307-91b8-d33c62513e41/cinder-api-log/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.316905 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_efc715c4-8350-4307-91b8-d33c62513e41/cinder-api/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.420792 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1eb12305-93aa-4b0a-960a-939eb7b74bec/probe/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.460352 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1eb12305-93aa-4b0a-960a-939eb7b74bec/cinder-scheduler/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.680010 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj_4c333cb5-3633-4cfe-825d-abc93c751acd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.808401 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj_49b1076f-b620-47cc-8cbf-c70ecdbeab06/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:33 crc kubenswrapper[4783]: I0131 09:50:33.942126 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/init/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.090993 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/init/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.157123 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/dnsmasq-dns/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.185466 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc_f6315c3c-0101-4935-b081-37414dd7e27e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.329721 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b9825cb-8bd2-446b-80ab-d6bdd294d51d/glance-log/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.341967 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b9825cb-8bd2-446b-80ab-d6bdd294d51d/glance-httpd/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.483800 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6c4aab5d-1107-4452-9f07-fc45f446eb01/glance-httpd/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.522907 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_6c4aab5d-1107-4452-9f07-fc45f446eb01/glance-log/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.613291 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6644bf8978-q24zg_940e2e96-d6a1-4576-b83a-e30ff1f6ab85/horizon/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.768151 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk_b59b9070-af55-4915-bae2-414ca2aab1b7/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.892305 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6644bf8978-q24zg_940e2e96-d6a1-4576-b83a-e30ff1f6ab85/horizon-log/0.log" Jan 31 09:50:34 crc kubenswrapper[4783]: I0131 09:50:34.973627 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c9hls_77285e40-3f9b-491d-a911-0f7b5c8058fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.139113 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-688b79757c-l8xjk_5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae/keystone-api/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.160137 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7c07732e-abfe-48cc-86c9-b500fff4977d/kube-state-metrics/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.376747 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm_777883d7-012b-4006-afcb-d5fcd8a0eb68/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.684843 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-754bf5467-627tt_e522ac0d-9e88-42f8-82a7-54cb22a15841/neutron-api/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.705784 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754bf5467-627tt_e522ac0d-9e88-42f8-82a7-54cb22a15841/neutron-httpd/0.log" Jan 31 09:50:35 crc kubenswrapper[4783]: I0131 09:50:35.814183 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2_a22a7456-83e3-46ef-80c2-ebea731972b9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.134170 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b4163808-2c2d-4fdd-a2b3-84b36dfa4112/nova-api-log/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.316438 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7/nova-cell0-conductor-conductor/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.318267 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b4163808-2c2d-4fdd-a2b3-84b36dfa4112/nova-api-api/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.435070 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5d42e544-b623-439f-b4a8-9ee7cb72386c/nova-cell1-conductor-conductor/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.544840 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4655d42d-6876-4b07-bde2-d8a70c62018d/nova-cell1-novncproxy-novncproxy/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.709698 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q6sd4_57ec9c0f-9c30-4c10-afd7-84ac778f9069/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:36 crc kubenswrapper[4783]: I0131 09:50:36.832087 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_011e6c87-b549-4268-b64e-b5d49c9e7cd8/nova-metadata-log/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.072068 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_43315408-26b4-4864-af4e-e3cbad195816/nova-scheduler-scheduler/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.099991 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/mysql-bootstrap/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.251678 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/mysql-bootstrap/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.284813 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/galera/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.468523 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/mysql-bootstrap/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.522074 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_011e6c87-b549-4268-b64e-b5d49c9e7cd8/nova-metadata-metadata/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.689495 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/mysql-bootstrap/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.708415 4783 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/galera/0.log" Jan 31 09:50:37 crc kubenswrapper[4783]: I0131 09:50:37.736543 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b1789966-6119-4be7-87b8-cca3381fc380/openstackclient/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.000391 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fmnfb_7e49f761-fcba-4ec3-9091-61f056e4eb58/openstack-network-exporter/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.026722 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server-init/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.158505 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server-init/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.209342 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.260582 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovs-vswitchd/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.365196 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rll65_ea827790-18ef-4c55-8b5f-365ead9b9f6c/ovn-controller/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.509191 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zp2tb_cf5af9ea-73f9-4316-8fdc-abe4ede8632a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 
09:50:38.561812 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_273da0f4-592f-4736-8435-28cd6f46ed55/openstack-network-exporter/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.651138 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_273da0f4-592f-4736-8435-28cd6f46ed55/ovn-northd/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.786843 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e90cb35-3366-44df-9238-3da82d300654/openstack-network-exporter/0.log" Jan 31 09:50:38 crc kubenswrapper[4783]: I0131 09:50:38.866149 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e90cb35-3366-44df-9238-3da82d300654/ovsdbserver-nb/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.074202 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bed60581-fb96-4e66-bd14-2e2c0f75a771/openstack-network-exporter/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.153210 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bed60581-fb96-4e66-bd14-2e2c0f75a771/ovsdbserver-sb/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.242219 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5df45c7c98-nt6z5_05962991-82e5-4c31-87fa-c7df3cba5f90/placement-api/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.335322 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5df45c7c98-nt6z5_05962991-82e5-4c31-87fa-c7df3cba5f90/placement-log/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.418582 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/setup-container/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.582786 4783 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/setup-container/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.624277 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/rabbitmq/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.691751 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/setup-container/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.826911 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/setup-container/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.933278 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb_cf6bea6a-6877-4624-b8ac-cfd51fb514a9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:39 crc kubenswrapper[4783]: I0131 09:50:39.983230 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/rabbitmq/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.105854 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-trw4c_968fb64b-ed6e-492b-a508-41dd3dd98085/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.172799 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29_f3679d22-7479-40f7-9c8b-2e0caa156965/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.328486 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5mpq2_35e3815e-af8f-4724-846b-ea6038002f70/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.402661 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bmbrc_e9ce688e-1f0e-486c-b3c7-4b45243713ed/ssh-known-hosts-edpm-deployment/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.548631 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7846c976fc-knpz2_ff4d96b6-b227-41e4-a653-39b8475aa9de/proxy-server/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.581459 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7846c976fc-knpz2_ff4d96b6-b227-41e4-a653-39b8475aa9de/proxy-httpd/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.695386 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-58vfp_6827ccb1-8fcf-4451-a878-25d3d5765ae6/swift-ring-rebalance/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.818336 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-auditor/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.933201 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-replicator/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.948009 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-reaper/0.log" Jan 31 09:50:40 crc kubenswrapper[4783]: I0131 09:50:40.961537 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-server/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 
09:50:41.025854 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-auditor/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.139733 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-updater/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.170307 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-replicator/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.212307 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-server/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.284281 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-auditor/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.329343 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-expirer/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.417672 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-replicator/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.454661 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-server/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.469800 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-updater/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.551848 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/rsync/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.603425 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/swift-recon-cron/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.834914 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_16292449-5a29-426e-aa57-18e752dd60f6/tempest-tests-tempest-tests-runner/0.log" Jan 31 09:50:41 crc kubenswrapper[4783]: I0131 09:50:41.840969 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g_36f447a7-72aa-465f-8fad-1c0bb7c71a9e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:42 crc kubenswrapper[4783]: I0131 09:50:42.135989 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3438915b-514e-4298-842a-c77aefe49803/test-operator-logs-container/0.log" Jan 31 09:50:42 crc kubenswrapper[4783]: I0131 09:50:42.224674 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-97pwd_1f0db336-229b-4d05-b3e9-aaa8b26b08c4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:50:47 crc kubenswrapper[4783]: I0131 09:50:47.756535 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:50:47 crc kubenswrapper[4783]: I0131 09:50:47.757807 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:50:50 crc kubenswrapper[4783]: I0131 09:50:50.224421 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c967214f-ce54-4ac2-ae54-2d750133ff97/memcached/0.log" Jan 31 09:51:04 crc kubenswrapper[4783]: I0131 09:51:04.682913 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:51:04 crc kubenswrapper[4783]: I0131 09:51:04.832857 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:51:04 crc kubenswrapper[4783]: I0131 09:51:04.847335 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:51:04 crc kubenswrapper[4783]: I0131 09:51:04.912359 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.111386 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/extract/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.197074 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.213043 4783 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.317214 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-gwkhk_3a079322-76ea-4cb9-a8b6-3f0b1a360086/manager/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.405012 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-vjgpf_99914340-4708-4322-996f-7392f6fe6e02/manager/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.485270 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-k9948_7ddb9dd0-fc57-4685-a7d5-778a4152ea58/manager/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.604058 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-btfdw_e4606683-7a0b-4a0f-ae81-3c6e598a36e6/manager/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.677992 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-ln9h6_1e1e07c3-0aeb-47fd-be71-a13716a04f29/manager/0.log" Jan 31 09:51:05 crc kubenswrapper[4783]: I0131 09:51:05.806350 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-4nwdm_1de23cab-104f-49ac-ab9f-3b1d08733ff9/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.004332 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-82z66_bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 
09:51:06.102725 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-rn96f_5033c800-ef69-4228-a204-b66401c4725c/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.191714 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-t7nxg_cd28073a-c4f4-4b1d-9680-a9d5a5939deb/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.236243 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-97h8r_8fdee142-92ef-49d1-bac6-6f6c3873b2cb/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.378029 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-bxv5z_acb9dbfe-e754-4021-bc54-7ccd17b217a4/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.438664 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-fvml6_be9f1345-8ca5-49da-a52e-4b841ea07ac3/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.598844 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-tmf9x_8178283d-c10c-45e6-a465-bdb5096d8904/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.630467 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-s842k_fa110500-5ebd-4645-86a1-e3bf4b9780fe/manager/0.log" Jan 31 09:51:06 crc kubenswrapper[4783]: I0131 09:51:06.775914 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp_10649b85-8b3a-44e2-9477-6f5821d232a7/manager/0.log" Jan 31 09:51:06 crc 
kubenswrapper[4783]: I0131 09:51:06.864329 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-s5jcf_5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab/operator/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.031363 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z2jkn_474ac824-d8f5-4d2b-9b6f-c385808d57b8/registry-server/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.282240 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-x6rmh_5bc18646-6e2c-41b5-8690-b6b7eda1a8cc/manager/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.311285 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-x86p2_caf4a0bd-2f55-4185-a756-4a640cbfe8d3/manager/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.535694 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pfgj4_2d0fc101-8afd-4154-9741-d5d3520990fe/operator/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.719027 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-5hw2c_c42112c2-917b-491b-9a4a-5253a0fc8d09/manager/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.827548 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-8dwmb_8ad3860f-1b60-4522-8026-08212156646d/manager/0.log" Jan 31 09:51:07 crc kubenswrapper[4783]: I0131 09:51:07.920421 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-jqzh7_6360a87f-2ddf-4c17-9f25-cff4e0f5e747/manager/0.log" Jan 31 09:51:07 crc 
kubenswrapper[4783]: I0131 09:51:07.962933 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-9d95x_d6255583-1dd4-4901-b3af-8619aa03434b/manager/0.log" Jan 31 09:51:08 crc kubenswrapper[4783]: I0131 09:51:08.118877 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-sbgw8_b4b9ea20-ea14-4c87-b40e-5767debc9f57/manager/0.log" Jan 31 09:51:17 crc kubenswrapper[4783]: I0131 09:51:17.757173 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:51:17 crc kubenswrapper[4783]: I0131 09:51:17.757717 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:51:25 crc kubenswrapper[4783]: I0131 09:51:25.271465 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4qqm_a3f564d8-5f07-446d-9dd1-955e39d4a5f4/control-plane-machine-set-operator/0.log" Jan 31 09:51:25 crc kubenswrapper[4783]: I0131 09:51:25.454463 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m54sp_0e337b91-adb8-4cb7-8e5e-be2b80e78f56/machine-api-operator/0.log" Jan 31 09:51:25 crc kubenswrapper[4783]: I0131 09:51:25.473072 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m54sp_0e337b91-adb8-4cb7-8e5e-be2b80e78f56/kube-rbac-proxy/0.log" Jan 31 09:51:36 crc kubenswrapper[4783]: I0131 09:51:36.968230 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-bvf4q_5ac717c7-d413-4765-9bf5-b0d7ad8163c6/cert-manager-controller/0.log" Jan 31 09:51:37 crc kubenswrapper[4783]: I0131 09:51:37.138330 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mhfqz_c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd/cert-manager-cainjector/0.log" Jan 31 09:51:37 crc kubenswrapper[4783]: I0131 09:51:37.225248 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-bhlz2_db7bedab-fb68-4a6a-887c-d2aa1a63d0ee/cert-manager-webhook/0.log" Jan 31 09:51:47 crc kubenswrapper[4783]: I0131 09:51:47.756820 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:51:47 crc kubenswrapper[4783]: I0131 09:51:47.757407 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:51:47 crc kubenswrapper[4783]: I0131 09:51:47.757454 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:51:47 crc kubenswrapper[4783]: I0131 09:51:47.757985 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:51:47 crc kubenswrapper[4783]: I0131 09:51:47.758038 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" containerID="cri-o://3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef" gracePeriod=600 Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.396195 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef" exitCode=0 Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.396320 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef"} Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.396793 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6"} Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.396845 4783 scope.go:117] "RemoveContainer" containerID="6da0cb7dc67773f0235e797779d7fc1fb1165229df28540cd24ecaa36f2de2b9" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.543326 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-58rxc_dd570070-6aad-4b28-aefd-e4e2ce7e6a8c/nmstate-console-plugin/0.log" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.710215 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vc2gq_4b90a875-3ddc-4ba4-a62a-dd83c9de4d59/nmstate-handler/0.log" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.756972 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kw4vm_20903e03-98bb-4970-b9b9-9088bfbd1902/kube-rbac-proxy/0.log" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.838538 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kw4vm_20903e03-98bb-4970-b9b9-9088bfbd1902/nmstate-metrics/0.log" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.918464 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-7nq22_66e4885b-8227-4433-8208-12dad761b627/nmstate-operator/0.log" Jan 31 09:51:48 crc kubenswrapper[4783]: I0131 09:51:48.985153 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-2cm9k_fe6808ce-fbb6-4782-831b-892b074b7267/nmstate-webhook/0.log" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.269445 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 09:52:00 crc kubenswrapper[4783]: E0131 09:52:00.270364 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1406d5-2707-4f81-a41e-f25af0ca8627" containerName="container-00" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.270379 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1406d5-2707-4f81-a41e-f25af0ca8627" containerName="container-00" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.270596 4783 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0c1406d5-2707-4f81-a41e-f25af0ca8627" containerName="container-00" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.272075 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.286581 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.333651 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.333951 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbnkn\" (UniqueName: \"kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.334144 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.437020 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbnkn\" (UniqueName: \"kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn\") pod \"redhat-operators-ch2nl\" (UID: 
\"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.437657 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.437854 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.438563 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.438337 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.458971 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbnkn\" (UniqueName: \"kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn\") pod \"redhat-operators-ch2nl\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " 
pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:00 crc kubenswrapper[4783]: I0131 09:52:00.588112 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:01 crc kubenswrapper[4783]: I0131 09:52:01.022376 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 09:52:01 crc kubenswrapper[4783]: I0131 09:52:01.528815 4783 generic.go:334] "Generic (PLEG): container finished" podID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerID="209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e" exitCode=0 Jan 31 09:52:01 crc kubenswrapper[4783]: I0131 09:52:01.528915 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerDied","Data":"209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e"} Jan 31 09:52:01 crc kubenswrapper[4783]: I0131 09:52:01.529122 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerStarted","Data":"6bfcdc3ab53ef51f7c5467d2a92f9f2762d217277c428b8ff09ac0364c10e888"} Jan 31 09:52:02 crc kubenswrapper[4783]: I0131 09:52:02.538231 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerStarted","Data":"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58"} Jan 31 09:52:05 crc kubenswrapper[4783]: I0131 09:52:05.564781 4783 generic.go:334] "Generic (PLEG): container finished" podID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerID="9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58" exitCode=0 Jan 31 09:52:05 crc kubenswrapper[4783]: I0131 09:52:05.564808 4783 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerDied","Data":"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58"} Jan 31 09:52:06 crc kubenswrapper[4783]: I0131 09:52:06.576697 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerStarted","Data":"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65"} Jan 31 09:52:06 crc kubenswrapper[4783]: I0131 09:52:06.606325 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ch2nl" podStartSLOduration=1.997322692 podStartE2EDuration="6.606305477s" podCreationTimestamp="2026-01-31 09:52:00 +0000 UTC" firstStartedPulling="2026-01-31 09:52:01.531205379 +0000 UTC m=+2832.199888848" lastFinishedPulling="2026-01-31 09:52:06.140188165 +0000 UTC m=+2836.808871633" observedRunningTime="2026-01-31 09:52:06.594407555 +0000 UTC m=+2837.263091024" watchObservedRunningTime="2026-01-31 09:52:06.606305477 +0000 UTC m=+2837.274988946" Jan 31 09:52:10 crc kubenswrapper[4783]: I0131 09:52:10.588299 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:10 crc kubenswrapper[4783]: I0131 09:52:10.588824 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:11 crc kubenswrapper[4783]: I0131 09:52:11.622404 4783 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ch2nl" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="registry-server" probeResult="failure" output=< Jan 31 09:52:11 crc kubenswrapper[4783]: timeout: failed to connect service ":50051" within 1s Jan 31 09:52:11 crc kubenswrapper[4783]: > Jan 31 09:52:12 crc kubenswrapper[4783]: I0131 
09:52:12.910571 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-qcxmw_90502cfa-f884-4181-b14c-98b49f254530/kube-rbac-proxy/0.log" Jan 31 09:52:12 crc kubenswrapper[4783]: I0131 09:52:12.987723 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-qcxmw_90502cfa-f884-4181-b14c-98b49f254530/controller/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.104769 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2q4pc_1e56e9fc-1576-4315-97b2-fa45c03bb8ca/frr-k8s-webhook-server/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.192069 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.322967 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.330618 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.342758 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.344728 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.489495 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.492245 4783 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.493071 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.526960 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.661386 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.683805 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.691941 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.697419 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/controller/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.831549 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/frr-metrics/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.863290 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/kube-rbac-proxy/0.log" Jan 31 09:52:13 crc kubenswrapper[4783]: I0131 09:52:13.882482 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/kube-rbac-proxy-frr/0.log" Jan 31 09:52:14 crc kubenswrapper[4783]: I0131 09:52:14.075832 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/reloader/0.log" Jan 31 09:52:14 crc kubenswrapper[4783]: I0131 09:52:14.095597 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84cd58cb5d-hkgvw_db52c704-6d54-4f49-9168-903b12ed4a25/manager/0.log" Jan 31 09:52:14 crc kubenswrapper[4783]: I0131 09:52:14.290547 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbf6c4975-7nlfs_9392ad71-70f9-4727-baaa-68ddfa6b3361/webhook-server/0.log" Jan 31 09:52:14 crc kubenswrapper[4783]: I0131 09:52:14.498818 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn6z7_9d4ef83f-da80-4f86-8e3f-6618d9bd5c44/kube-rbac-proxy/0.log" Jan 31 09:52:14 crc kubenswrapper[4783]: I0131 09:52:14.853139 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn6z7_9d4ef83f-da80-4f86-8e3f-6618d9bd5c44/speaker/0.log" Jan 31 09:52:15 crc kubenswrapper[4783]: I0131 09:52:15.017247 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/frr/0.log" Jan 31 09:52:20 crc kubenswrapper[4783]: I0131 09:52:20.630516 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:20 crc kubenswrapper[4783]: I0131 09:52:20.672146 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:20 crc kubenswrapper[4783]: I0131 09:52:20.868067 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 
09:52:21 crc kubenswrapper[4783]: I0131 09:52:21.706908 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ch2nl" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="registry-server" containerID="cri-o://969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65" gracePeriod=2 Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.091682 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.293484 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbnkn\" (UniqueName: \"kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn\") pod \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.293642 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities\") pod \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.293714 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content\") pod \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\" (UID: \"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a\") " Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.294329 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities" (OuterVolumeSpecName: "utilities") pod "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" (UID: "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.294919 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.298904 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn" (OuterVolumeSpecName: "kube-api-access-pbnkn") pod "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" (UID: "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a"). InnerVolumeSpecName "kube-api-access-pbnkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.384888 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" (UID: "f6fe6679-6272-4d9d-8b2a-56ea5d66b69a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.397246 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbnkn\" (UniqueName: \"kubernetes.io/projected/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-kube-api-access-pbnkn\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.397280 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.716469 4783 generic.go:334] "Generic (PLEG): container finished" podID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerID="969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65" exitCode=0 Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.716530 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerDied","Data":"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65"} Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.716569 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ch2nl" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.716585 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch2nl" event={"ID":"f6fe6679-6272-4d9d-8b2a-56ea5d66b69a","Type":"ContainerDied","Data":"6bfcdc3ab53ef51f7c5467d2a92f9f2762d217277c428b8ff09ac0364c10e888"} Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.716609 4783 scope.go:117] "RemoveContainer" containerID="969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.741647 4783 scope.go:117] "RemoveContainer" containerID="9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.755319 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.768613 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ch2nl"] Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.771104 4783 scope.go:117] "RemoveContainer" containerID="209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.799481 4783 scope.go:117] "RemoveContainer" containerID="969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65" Jan 31 09:52:22 crc kubenswrapper[4783]: E0131 09:52:22.799879 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65\": container with ID starting with 969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65 not found: ID does not exist" containerID="969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.799918 4783 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65"} err="failed to get container status \"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65\": rpc error: code = NotFound desc = could not find container \"969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65\": container with ID starting with 969cfc389d63656059d4e1dfbf5e68ff273be31f3a836c93fefa3646a3cf7e65 not found: ID does not exist" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.799948 4783 scope.go:117] "RemoveContainer" containerID="9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58" Jan 31 09:52:22 crc kubenswrapper[4783]: E0131 09:52:22.800280 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58\": container with ID starting with 9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58 not found: ID does not exist" containerID="9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.800303 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58"} err="failed to get container status \"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58\": rpc error: code = NotFound desc = could not find container \"9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58\": container with ID starting with 9a622a4ff819650308cbf465a16dc3a24249b68ac01285574604cc7ebf96fb58 not found: ID does not exist" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.800320 4783 scope.go:117] "RemoveContainer" containerID="209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e" Jan 31 09:52:22 crc kubenswrapper[4783]: E0131 
09:52:22.800579 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e\": container with ID starting with 209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e not found: ID does not exist" containerID="209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e" Jan 31 09:52:22 crc kubenswrapper[4783]: I0131 09:52:22.800613 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e"} err="failed to get container status \"209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e\": rpc error: code = NotFound desc = could not find container \"209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e\": container with ID starting with 209f57d3fd1f9d6b9e51165dbb3d9885d4fa7e2be07f4037c449a34fd388f04e not found: ID does not exist" Jan 31 09:52:23 crc kubenswrapper[4783]: I0131 09:52:23.656239 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" path="/var/lib/kubelet/pods/f6fe6679-6272-4d9d-8b2a-56ea5d66b69a/volumes" Jan 31 09:52:25 crc kubenswrapper[4783]: I0131 09:52:25.917989 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.062387 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.089828 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.117024 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.247459 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.288191 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/extract/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.296405 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.409627 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.570501 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.578806 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 
09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.602974 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.730835 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.736137 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/extract/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.742725 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 09:52:26 crc kubenswrapper[4783]: I0131 09:52:26.896568 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.056927 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.069072 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.081708 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 
09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.228910 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.258294 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.437973 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.545318 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/registry-server/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.660465 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.675350 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.685111 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.820489 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:52:27 crc kubenswrapper[4783]: I0131 09:52:27.854504 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.038921 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-729tq_6e7fa19e-aa64-4479-805e-62625ccc19b8/marketplace-operator/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.072403 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-729tq_6e7fa19e-aa64-4479-805e-62625ccc19b8/marketplace-operator/1.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.170996 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/registry-server/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.223256 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.359227 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.376580 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.379118 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.529243 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.576451 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.641096 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/registry-server/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.728251 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.858721 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.888893 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:52:28 crc kubenswrapper[4783]: I0131 09:52:28.923527 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 31 09:52:29 crc kubenswrapper[4783]: I0131 09:52:29.024556 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:52:29 crc kubenswrapper[4783]: I0131 09:52:29.032895 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 
31 09:52:29 crc kubenswrapper[4783]: I0131 09:52:29.380936 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/registry-server/0.log" Jan 31 09:52:53 crc kubenswrapper[4783]: E0131 09:52:53.822204 4783 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.246:36076->192.168.26.246:36733: write tcp 192.168.26.246:36076->192.168.26.246:36733: write: broken pipe Jan 31 09:53:51 crc kubenswrapper[4783]: I0131 09:53:51.469912 4783 generic.go:334] "Generic (PLEG): container finished" podID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerID="6b1b6ba4969b810c9d93a1977b178a89eca60ef39a705ab887b25ca3ad2b640d" exitCode=0 Jan 31 09:53:51 crc kubenswrapper[4783]: I0131 09:53:51.469936 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5rgv/must-gather-l9j56" event={"ID":"a7771c91-54d0-4aec-aacf-267513a0eea2","Type":"ContainerDied","Data":"6b1b6ba4969b810c9d93a1977b178a89eca60ef39a705ab887b25ca3ad2b640d"} Jan 31 09:53:51 crc kubenswrapper[4783]: I0131 09:53:51.471429 4783 scope.go:117] "RemoveContainer" containerID="6b1b6ba4969b810c9d93a1977b178a89eca60ef39a705ab887b25ca3ad2b640d" Jan 31 09:53:51 crc kubenswrapper[4783]: I0131 09:53:51.707705 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5rgv_must-gather-l9j56_a7771c91-54d0-4aec-aacf-267513a0eea2/gather/0.log" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.215752 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f5rgv/must-gather-l9j56"] Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.216671 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f5rgv/must-gather-l9j56" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="copy" containerID="cri-o://3914445ec00f61c2d5da25f7bce90a71fbc916f5864f19f7200f04a4d61c77a2" gracePeriod=2 
Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.233033 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f5rgv/must-gather-l9j56"] Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.541393 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5rgv_must-gather-l9j56_a7771c91-54d0-4aec-aacf-267513a0eea2/copy/0.log" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.546832 4783 generic.go:334] "Generic (PLEG): container finished" podID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerID="3914445ec00f61c2d5da25f7bce90a71fbc916f5864f19f7200f04a4d61c77a2" exitCode=143 Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.546921 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2a0fd8f4904dd8e23da05a469c0dc030b487c09cf69f6a826eff53ce542ed63" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.590696 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f5rgv_must-gather-l9j56_a7771c91-54d0-4aec-aacf-267513a0eea2/copy/0.log" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.591077 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.762247 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn8j5\" (UniqueName: \"kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5\") pod \"a7771c91-54d0-4aec-aacf-267513a0eea2\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.762521 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output\") pod \"a7771c91-54d0-4aec-aacf-267513a0eea2\" (UID: \"a7771c91-54d0-4aec-aacf-267513a0eea2\") " Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.768924 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5" (OuterVolumeSpecName: "kube-api-access-xn8j5") pod "a7771c91-54d0-4aec-aacf-267513a0eea2" (UID: "a7771c91-54d0-4aec-aacf-267513a0eea2"). InnerVolumeSpecName "kube-api-access-xn8j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.866567 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn8j5\" (UniqueName: \"kubernetes.io/projected/a7771c91-54d0-4aec-aacf-267513a0eea2-kube-api-access-xn8j5\") on node \"crc\" DevicePath \"\"" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.916799 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a7771c91-54d0-4aec-aacf-267513a0eea2" (UID: "a7771c91-54d0-4aec-aacf-267513a0eea2"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:53:59 crc kubenswrapper[4783]: I0131 09:53:59.969331 4783 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7771c91-54d0-4aec-aacf-267513a0eea2-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:00 crc kubenswrapper[4783]: I0131 09:54:00.553732 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5rgv/must-gather-l9j56" Jan 31 09:54:01 crc kubenswrapper[4783]: I0131 09:54:01.656659 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" path="/var/lib/kubelet/pods/a7771c91-54d0-4aec-aacf-267513a0eea2/volumes" Jan 31 09:54:17 crc kubenswrapper[4783]: I0131 09:54:17.757011 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:54:17 crc kubenswrapper[4783]: I0131 09:54:17.757616 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:54:47 crc kubenswrapper[4783]: I0131 09:54:47.756755 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:54:47 crc kubenswrapper[4783]: I0131 09:54:47.757406 4783 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:55:17 crc kubenswrapper[4783]: I0131 09:55:17.756552 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:55:17 crc kubenswrapper[4783]: I0131 09:55:17.757326 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:55:17 crc kubenswrapper[4783]: I0131 09:55:17.757370 4783 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" Jan 31 09:55:17 crc kubenswrapper[4783]: I0131 09:55:17.757871 4783 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6"} pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:55:17 crc kubenswrapper[4783]: I0131 09:55:17.757927 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" 
containerID="cri-o://a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" gracePeriod=600 Jan 31 09:55:17 crc kubenswrapper[4783]: E0131 09:55:17.875926 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:55:18 crc kubenswrapper[4783]: I0131 09:55:18.278233 4783 generic.go:334] "Generic (PLEG): container finished" podID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" exitCode=0 Jan 31 09:55:18 crc kubenswrapper[4783]: I0131 09:55:18.278288 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerDied","Data":"a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6"} Jan 31 09:55:18 crc kubenswrapper[4783]: I0131 09:55:18.278339 4783 scope.go:117] "RemoveContainer" containerID="3d7e158b49ce59cc2fb38a7e77af20dfdc83de86afbe948aa4710ad4b8760eef" Jan 31 09:55:18 crc kubenswrapper[4783]: I0131 09:55:18.278728 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:55:18 crc kubenswrapper[4783]: E0131 09:55:18.279072 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" 
podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:55:32 crc kubenswrapper[4783]: I0131 09:55:32.647269 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:55:32 crc kubenswrapper[4783]: E0131 09:55:32.648115 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:55:44 crc kubenswrapper[4783]: I0131 09:55:44.646422 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:55:44 crc kubenswrapper[4783]: E0131 09:55:44.647544 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:55:59 crc kubenswrapper[4783]: I0131 09:55:59.655554 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:55:59 crc kubenswrapper[4783]: E0131 09:55:59.656957 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:56:01 crc kubenswrapper[4783]: I0131 09:56:01.278113 4783 scope.go:117] "RemoveContainer" containerID="3914445ec00f61c2d5da25f7bce90a71fbc916f5864f19f7200f04a4d61c77a2" Jan 31 09:56:01 crc kubenswrapper[4783]: I0131 09:56:01.303247 4783 scope.go:117] "RemoveContainer" containerID="6b1b6ba4969b810c9d93a1977b178a89eca60ef39a705ab887b25ca3ad2b640d" Jan 31 09:56:01 crc kubenswrapper[4783]: I0131 09:56:01.370008 4783 scope.go:117] "RemoveContainer" containerID="c3ae5158febf577f33113a8f5df4077fc04ea665e47e4a7d534ff23ca583a06a" Jan 31 09:56:13 crc kubenswrapper[4783]: I0131 09:56:13.646352 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:56:13 crc kubenswrapper[4783]: E0131 09:56:13.647238 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.090400 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c8zqw/must-gather-fxhdm"] Jan 31 09:56:18 crc kubenswrapper[4783]: E0131 09:56:18.091499 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="gather" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091517 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="gather" Jan 31 09:56:18 crc kubenswrapper[4783]: E0131 09:56:18.091537 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="copy" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091543 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="copy" Jan 31 09:56:18 crc kubenswrapper[4783]: E0131 09:56:18.091564 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="registry-server" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091571 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="registry-server" Jan 31 09:56:18 crc kubenswrapper[4783]: E0131 09:56:18.091586 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="extract-content" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091591 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="extract-content" Jan 31 09:56:18 crc kubenswrapper[4783]: E0131 09:56:18.091611 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="extract-utilities" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091617 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="extract-utilities" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091832 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="gather" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091848 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fe6679-6272-4d9d-8b2a-56ea5d66b69a" containerName="registry-server" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.091875 4783 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a7771c91-54d0-4aec-aacf-267513a0eea2" containerName="copy" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.092963 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.094966 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c8zqw"/"kube-root-ca.crt" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.095395 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c8zqw"/"openshift-service-ca.crt" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.099865 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c8zqw/must-gather-fxhdm"] Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.220332 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.220736 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgpz\" (UniqueName: \"kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.323049 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgpz\" (UniqueName: \"kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " 
pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.323128 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.323613 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.339785 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgpz\" (UniqueName: \"kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz\") pod \"must-gather-fxhdm\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.408422 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.822467 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c8zqw/must-gather-fxhdm"] Jan 31 09:56:18 crc kubenswrapper[4783]: I0131 09:56:18.853983 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" event={"ID":"2eab12dc-dad2-43be-9823-6a868e74f9a0","Type":"ContainerStarted","Data":"1c285c2ce69e50d158f415eaf51662d62255d3c5af14effbeeaafb3e9b7b4e5e"} Jan 31 09:56:19 crc kubenswrapper[4783]: I0131 09:56:19.864945 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" event={"ID":"2eab12dc-dad2-43be-9823-6a868e74f9a0","Type":"ContainerStarted","Data":"27aa26442f30696e4407e07f8176023dde9011920e2735d65ba90d5d8ce0ae78"} Jan 31 09:56:19 crc kubenswrapper[4783]: I0131 09:56:19.865415 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" event={"ID":"2eab12dc-dad2-43be-9823-6a868e74f9a0","Type":"ContainerStarted","Data":"0023ccdb4626cadbe827315480b86721ff80debe53267fc3af009699d8ac3015"} Jan 31 09:56:19 crc kubenswrapper[4783]: I0131 09:56:19.890460 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" podStartSLOduration=1.890437462 podStartE2EDuration="1.890437462s" podCreationTimestamp="2026-01-31 09:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:56:19.878559888 +0000 UTC m=+3090.547243356" watchObservedRunningTime="2026-01-31 09:56:19.890437462 +0000 UTC m=+3090.559120919" Jan 31 09:56:21 crc kubenswrapper[4783]: I0131 09:56:21.934835 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:21 crc 
kubenswrapper[4783]: I0131 09:56:21.936880 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:21 crc kubenswrapper[4783]: I0131 09:56:21.951154 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.116521 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5cf\" (UniqueName: \"kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.116599 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.116638 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.219266 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5cf\" (UniqueName: \"kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " 
pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.219432 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.219516 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.220046 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.220084 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.247506 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5cf\" (UniqueName: \"kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf\") pod \"certified-operators-2n957\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " 
pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.271610 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.456868 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-78zct"] Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.478960 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.481805 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c8zqw"/"default-dockercfg-d2zch" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.546042 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk2vq\" (UniqueName: \"kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.546086 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.649397 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk2vq\" (UniqueName: \"kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc 
kubenswrapper[4783]: I0131 09:56:22.649458 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.649597 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.674100 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk2vq\" (UniqueName: \"kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq\") pod \"crc-debug-78zct\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.683125 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.831194 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.917283 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-78zct" event={"ID":"1aa7f66a-afc2-49b6-b69b-6a639c8a40be","Type":"ContainerStarted","Data":"2929557f1f3ec9c559d232b11b6378a50d4e27466b490edfcda999a803dad1ec"} Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.928302 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerStarted","Data":"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c"} Jan 31 09:56:22 crc kubenswrapper[4783]: I0131 09:56:22.928436 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerStarted","Data":"57790fdb72d603b3fe4aae1e132d6be32aa4d2b54c63400b8e5a8cb8e6e86b3b"} Jan 31 09:56:23 crc kubenswrapper[4783]: I0131 09:56:23.942144 4783 generic.go:334] "Generic (PLEG): container finished" podID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerID="8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c" exitCode=0 Jan 31 09:56:23 crc kubenswrapper[4783]: I0131 09:56:23.942403 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerDied","Data":"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c"} Jan 31 09:56:23 crc kubenswrapper[4783]: I0131 09:56:23.945442 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-78zct" event={"ID":"1aa7f66a-afc2-49b6-b69b-6a639c8a40be","Type":"ContainerStarted","Data":"6cc836bdfdf9ad1da72bacd2bf1f3a97bb87c9d01d218ff1c9ef04734b7634e1"} Jan 31 09:56:23 crc kubenswrapper[4783]: I0131 
09:56:23.947436 4783 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:56:23 crc kubenswrapper[4783]: I0131 09:56:23.981921 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c8zqw/crc-debug-78zct" podStartSLOduration=1.981900301 podStartE2EDuration="1.981900301s" podCreationTimestamp="2026-01-31 09:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:56:23.975913294 +0000 UTC m=+3094.644596761" watchObservedRunningTime="2026-01-31 09:56:23.981900301 +0000 UTC m=+3094.650583769" Jan 31 09:56:24 crc kubenswrapper[4783]: I0131 09:56:24.647170 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:56:24 crc kubenswrapper[4783]: E0131 09:56:24.647859 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:56:24 crc kubenswrapper[4783]: I0131 09:56:24.955650 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerStarted","Data":"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99"} Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.131666 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.133843 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.146314 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.210760 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.210891 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.211001 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxjfk\" (UniqueName: \"kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.313418 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxjfk\" (UniqueName: \"kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.313823 4783 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.314019 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.314358 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.314408 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.333806 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxjfk\" (UniqueName: \"kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk\") pod \"redhat-marketplace-jkrmz\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.450752 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.941434 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:25 crc kubenswrapper[4783]: W0131 09:56:25.950098 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575b313d_09df_4c50_8ab1_ba9498566347.slice/crio-646f58391af086a91fcd04bf0dba1c68f91c4f6a3c78166bdaf4a545581b0a3d WatchSource:0}: Error finding container 646f58391af086a91fcd04bf0dba1c68f91c4f6a3c78166bdaf4a545581b0a3d: Status 404 returned error can't find the container with id 646f58391af086a91fcd04bf0dba1c68f91c4f6a3c78166bdaf4a545581b0a3d Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.965322 4783 generic.go:334] "Generic (PLEG): container finished" podID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerID="e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99" exitCode=0 Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.965373 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerDied","Data":"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99"} Jan 31 09:56:25 crc kubenswrapper[4783]: I0131 09:56:25.968367 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerStarted","Data":"646f58391af086a91fcd04bf0dba1c68f91c4f6a3c78166bdaf4a545581b0a3d"} Jan 31 09:56:26 crc kubenswrapper[4783]: I0131 09:56:26.979040 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" 
event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerStarted","Data":"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba"} Jan 31 09:56:26 crc kubenswrapper[4783]: I0131 09:56:26.986841 4783 generic.go:334] "Generic (PLEG): container finished" podID="575b313d-09df-4c50-8ab1-ba9498566347" containerID="127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0" exitCode=0 Jan 31 09:56:26 crc kubenswrapper[4783]: I0131 09:56:26.986878 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerDied","Data":"127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0"} Jan 31 09:56:27 crc kubenswrapper[4783]: I0131 09:56:27.003506 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2n957" podStartSLOduration=3.4983794550000002 podStartE2EDuration="6.003490216s" podCreationTimestamp="2026-01-31 09:56:21 +0000 UTC" firstStartedPulling="2026-01-31 09:56:23.947134786 +0000 UTC m=+3094.615818255" lastFinishedPulling="2026-01-31 09:56:26.452245548 +0000 UTC m=+3097.120929016" observedRunningTime="2026-01-31 09:56:26.998494948 +0000 UTC m=+3097.667178406" watchObservedRunningTime="2026-01-31 09:56:27.003490216 +0000 UTC m=+3097.672173674" Jan 31 09:56:27 crc kubenswrapper[4783]: I0131 09:56:27.996647 4783 generic.go:334] "Generic (PLEG): container finished" podID="575b313d-09df-4c50-8ab1-ba9498566347" containerID="f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51" exitCode=0 Jan 31 09:56:27 crc kubenswrapper[4783]: I0131 09:56:27.996742 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerDied","Data":"f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51"} Jan 31 09:56:29 crc kubenswrapper[4783]: I0131 
09:56:29.006965 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerStarted","Data":"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8"} Jan 31 09:56:29 crc kubenswrapper[4783]: I0131 09:56:29.030134 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jkrmz" podStartSLOduration=2.57168938 podStartE2EDuration="4.030111578s" podCreationTimestamp="2026-01-31 09:56:25 +0000 UTC" firstStartedPulling="2026-01-31 09:56:26.988766878 +0000 UTC m=+3097.657450345" lastFinishedPulling="2026-01-31 09:56:28.447189075 +0000 UTC m=+3099.115872543" observedRunningTime="2026-01-31 09:56:29.026642707 +0000 UTC m=+3099.695326174" watchObservedRunningTime="2026-01-31 09:56:29.030111578 +0000 UTC m=+3099.698795046" Jan 31 09:56:32 crc kubenswrapper[4783]: I0131 09:56:32.273121 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:32 crc kubenswrapper[4783]: I0131 09:56:32.273730 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:32 crc kubenswrapper[4783]: I0131 09:56:32.312697 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:33 crc kubenswrapper[4783]: I0131 09:56:33.070835 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:33 crc kubenswrapper[4783]: I0131 09:56:33.114508 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.047931 4783 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-2n957" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="registry-server" containerID="cri-o://fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba" gracePeriod=2 Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.443637 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.451229 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.451301 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.497343 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.634702 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv5cf\" (UniqueName: \"kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf\") pod \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.635025 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content\") pod \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.635091 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities\") pod 
\"d55f08e6-5c83-4d41-8fa6-434f20cb562a\" (UID: \"d55f08e6-5c83-4d41-8fa6-434f20cb562a\") " Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.635806 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities" (OuterVolumeSpecName: "utilities") pod "d55f08e6-5c83-4d41-8fa6-434f20cb562a" (UID: "d55f08e6-5c83-4d41-8fa6-434f20cb562a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.639955 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf" (OuterVolumeSpecName: "kube-api-access-qv5cf") pod "d55f08e6-5c83-4d41-8fa6-434f20cb562a" (UID: "d55f08e6-5c83-4d41-8fa6-434f20cb562a"). InnerVolumeSpecName "kube-api-access-qv5cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.676448 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d55f08e6-5c83-4d41-8fa6-434f20cb562a" (UID: "d55f08e6-5c83-4d41-8fa6-434f20cb562a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.737616 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5cf\" (UniqueName: \"kubernetes.io/projected/d55f08e6-5c83-4d41-8fa6-434f20cb562a-kube-api-access-qv5cf\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.737649 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:35 crc kubenswrapper[4783]: I0131 09:56:35.737659 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55f08e6-5c83-4d41-8fa6-434f20cb562a-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.060344 4783 generic.go:334] "Generic (PLEG): container finished" podID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerID="fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba" exitCode=0 Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.060420 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerDied","Data":"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba"} Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.060479 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2n957" event={"ID":"d55f08e6-5c83-4d41-8fa6-434f20cb562a","Type":"ContainerDied","Data":"57790fdb72d603b3fe4aae1e132d6be32aa4d2b54c63400b8e5a8cb8e6e86b3b"} Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.060506 4783 scope.go:117] "RemoveContainer" containerID="fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 
09:56:36.061436 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2n957" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.091034 4783 scope.go:117] "RemoveContainer" containerID="e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.096748 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.109760 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2n957"] Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.122663 4783 scope.go:117] "RemoveContainer" containerID="8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.129762 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.168713 4783 scope.go:117] "RemoveContainer" containerID="fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba" Jan 31 09:56:36 crc kubenswrapper[4783]: E0131 09:56:36.169227 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba\": container with ID starting with fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba not found: ID does not exist" containerID="fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.169285 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba"} err="failed to get container status 
\"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba\": rpc error: code = NotFound desc = could not find container \"fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba\": container with ID starting with fd972a14bff7c2bb3af956c302c2c0fd9002742423f9bad3cb0666fbf75b65ba not found: ID does not exist" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.169313 4783 scope.go:117] "RemoveContainer" containerID="e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99" Jan 31 09:56:36 crc kubenswrapper[4783]: E0131 09:56:36.169591 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99\": container with ID starting with e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99 not found: ID does not exist" containerID="e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.169614 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99"} err="failed to get container status \"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99\": rpc error: code = NotFound desc = could not find container \"e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99\": container with ID starting with e5a41dad093f895fabaf73bde5c5f343715541ecc039b8aa73c848b4d0a4eb99 not found: ID does not exist" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.169631 4783 scope.go:117] "RemoveContainer" containerID="8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c" Jan 31 09:56:36 crc kubenswrapper[4783]: E0131 09:56:36.169932 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c\": container with ID starting with 8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c not found: ID does not exist" containerID="8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c" Jan 31 09:56:36 crc kubenswrapper[4783]: I0131 09:56:36.169972 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c"} err="failed to get container status \"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c\": rpc error: code = NotFound desc = could not find container \"8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c\": container with ID starting with 8f835cbe2053718a7836b6d5160f6bbc68b78501cf342713698809fde74ff13c not found: ID does not exist" Jan 31 09:56:37 crc kubenswrapper[4783]: I0131 09:56:37.645906 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:56:37 crc kubenswrapper[4783]: E0131 09:56:37.646637 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:56:37 crc kubenswrapper[4783]: I0131 09:56:37.655258 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" path="/var/lib/kubelet/pods/d55f08e6-5c83-4d41-8fa6-434f20cb562a/volumes" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.121781 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:38 crc 
kubenswrapper[4783]: I0131 09:56:38.122063 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jkrmz" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="registry-server" containerID="cri-o://926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8" gracePeriod=2 Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.526633 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.604028 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content\") pod \"575b313d-09df-4c50-8ab1-ba9498566347\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.604261 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxjfk\" (UniqueName: \"kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk\") pod \"575b313d-09df-4c50-8ab1-ba9498566347\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.604472 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities\") pod \"575b313d-09df-4c50-8ab1-ba9498566347\" (UID: \"575b313d-09df-4c50-8ab1-ba9498566347\") " Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.605784 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities" (OuterVolumeSpecName: "utilities") pod "575b313d-09df-4c50-8ab1-ba9498566347" (UID: "575b313d-09df-4c50-8ab1-ba9498566347"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.611387 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk" (OuterVolumeSpecName: "kube-api-access-xxjfk") pod "575b313d-09df-4c50-8ab1-ba9498566347" (UID: "575b313d-09df-4c50-8ab1-ba9498566347"). InnerVolumeSpecName "kube-api-access-xxjfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.625362 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "575b313d-09df-4c50-8ab1-ba9498566347" (UID: "575b313d-09df-4c50-8ab1-ba9498566347"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.707448 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxjfk\" (UniqueName: \"kubernetes.io/projected/575b313d-09df-4c50-8ab1-ba9498566347-kube-api-access-xxjfk\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.708685 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:38 crc kubenswrapper[4783]: I0131 09:56:38.708968 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/575b313d-09df-4c50-8ab1-ba9498566347-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.088360 4783 generic.go:334] "Generic (PLEG): container finished" podID="575b313d-09df-4c50-8ab1-ba9498566347" 
containerID="926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8" exitCode=0 Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.088565 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerDied","Data":"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8"} Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.088643 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkrmz" event={"ID":"575b313d-09df-4c50-8ab1-ba9498566347","Type":"ContainerDied","Data":"646f58391af086a91fcd04bf0dba1c68f91c4f6a3c78166bdaf4a545581b0a3d"} Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.088724 4783 scope.go:117] "RemoveContainer" containerID="926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.088882 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkrmz" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.106892 4783 scope.go:117] "RemoveContainer" containerID="f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.115230 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.122760 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkrmz"] Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.136263 4783 scope.go:117] "RemoveContainer" containerID="127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.161224 4783 scope.go:117] "RemoveContainer" containerID="926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8" Jan 31 09:56:39 crc kubenswrapper[4783]: E0131 09:56:39.161645 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8\": container with ID starting with 926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8 not found: ID does not exist" containerID="926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.161734 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8"} err="failed to get container status \"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8\": rpc error: code = NotFound desc = could not find container \"926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8\": container with ID starting with 926d81885bab38715c82f880cc2f96c0bd713c7ad4590836cc57b6b2b75e3dc8 not found: 
ID does not exist" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.161805 4783 scope.go:117] "RemoveContainer" containerID="f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51" Jan 31 09:56:39 crc kubenswrapper[4783]: E0131 09:56:39.162538 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51\": container with ID starting with f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51 not found: ID does not exist" containerID="f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.162644 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51"} err="failed to get container status \"f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51\": rpc error: code = NotFound desc = could not find container \"f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51\": container with ID starting with f324b192697d2b4e6e433187dd84b8e478924b553f3621f3ba084b4a14cbcd51 not found: ID does not exist" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.162708 4783 scope.go:117] "RemoveContainer" containerID="127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0" Jan 31 09:56:39 crc kubenswrapper[4783]: E0131 09:56:39.163135 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0\": container with ID starting with 127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0 not found: ID does not exist" containerID="127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.163194 4783 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0"} err="failed to get container status \"127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0\": rpc error: code = NotFound desc = could not find container \"127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0\": container with ID starting with 127f47c2351ba161850711a3f1c19764399697ee2c250ff3e968b94ea1ec6fa0 not found: ID does not exist" Jan 31 09:56:39 crc kubenswrapper[4783]: I0131 09:56:39.655082 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575b313d-09df-4c50-8ab1-ba9498566347" path="/var/lib/kubelet/pods/575b313d-09df-4c50-8ab1-ba9498566347/volumes" Jan 31 09:56:49 crc kubenswrapper[4783]: I0131 09:56:49.170232 4783 generic.go:334] "Generic (PLEG): container finished" podID="1aa7f66a-afc2-49b6-b69b-6a639c8a40be" containerID="6cc836bdfdf9ad1da72bacd2bf1f3a97bb87c9d01d218ff1c9ef04734b7634e1" exitCode=0 Jan 31 09:56:49 crc kubenswrapper[4783]: I0131 09:56:49.170286 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-78zct" event={"ID":"1aa7f66a-afc2-49b6-b69b-6a639c8a40be","Type":"ContainerDied","Data":"6cc836bdfdf9ad1da72bacd2bf1f3a97bb87c9d01d218ff1c9ef04734b7634e1"} Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.278384 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.305900 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-78zct"] Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.311928 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-78zct"] Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.350882 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host\") pod \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.350995 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host" (OuterVolumeSpecName: "host") pod "1aa7f66a-afc2-49b6-b69b-6a639c8a40be" (UID: "1aa7f66a-afc2-49b6-b69b-6a639c8a40be"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.351147 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk2vq\" (UniqueName: \"kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq\") pod \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\" (UID: \"1aa7f66a-afc2-49b6-b69b-6a639c8a40be\") " Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.351853 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.361656 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq" (OuterVolumeSpecName: "kube-api-access-mk2vq") pod "1aa7f66a-afc2-49b6-b69b-6a639c8a40be" (UID: "1aa7f66a-afc2-49b6-b69b-6a639c8a40be"). InnerVolumeSpecName "kube-api-access-mk2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:50 crc kubenswrapper[4783]: I0131 09:56:50.453623 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk2vq\" (UniqueName: \"kubernetes.io/projected/1aa7f66a-afc2-49b6-b69b-6a639c8a40be-kube-api-access-mk2vq\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.187180 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2929557f1f3ec9c559d232b11b6378a50d4e27466b490edfcda999a803dad1ec" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.187231 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-78zct" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.490835 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-hhqbs"] Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491157 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491193 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491207 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="extract-utilities" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491213 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="extract-utilities" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491224 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="extract-content" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491231 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="extract-content" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491238 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="extract-content" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491245 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="extract-content" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491263 4783 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491268 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491284 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa7f66a-afc2-49b6-b69b-6a639c8a40be" containerName="container-00" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491289 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa7f66a-afc2-49b6-b69b-6a639c8a40be" containerName="container-00" Jan 31 09:56:51 crc kubenswrapper[4783]: E0131 09:56:51.491301 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="extract-utilities" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491306 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="extract-utilities" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491480 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55f08e6-5c83-4d41-8fa6-434f20cb562a" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491495 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="575b313d-09df-4c50-8ab1-ba9498566347" containerName="registry-server" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.491514 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa7f66a-afc2-49b6-b69b-6a639c8a40be" containerName="container-00" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.492094 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.494424 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c8zqw"/"default-dockercfg-d2zch" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.569332 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnkkk\" (UniqueName: \"kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.569442 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.653664 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa7f66a-afc2-49b6-b69b-6a639c8a40be" path="/var/lib/kubelet/pods/1aa7f66a-afc2-49b6-b69b-6a639c8a40be/volumes" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.670585 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.670675 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " 
pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.670756 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkkk\" (UniqueName: \"kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.685033 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkkk\" (UniqueName: \"kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk\") pod \"crc-debug-hhqbs\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:51 crc kubenswrapper[4783]: I0131 09:56:51.806324 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.195135 4783 generic.go:334] "Generic (PLEG): container finished" podID="f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" containerID="dc504bbcd3b9f94b6ae6372404e13005e64f1870efb2a595034ac08687634319" exitCode=0 Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.195202 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" event={"ID":"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e","Type":"ContainerDied","Data":"dc504bbcd3b9f94b6ae6372404e13005e64f1870efb2a595034ac08687634319"} Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.195544 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" event={"ID":"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e","Type":"ContainerStarted","Data":"015aee9235d5e6c5576770e927c4205e567318121db6c731aadaddc996fa4086"} Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.586126 4783 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-hhqbs"] Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.593458 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-hhqbs"] Jan 31 09:56:52 crc kubenswrapper[4783]: I0131 09:56:52.646568 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:56:52 crc kubenswrapper[4783]: E0131 09:56:52.646965 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.274208 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.299676 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host\") pod \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.299798 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host" (OuterVolumeSpecName: "host") pod "f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" (UID: "f1fe9a56-3d56-4a0e-9de2-e72af85ab29e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.300096 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.401602 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnkkk\" (UniqueName: \"kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk\") pod \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\" (UID: \"f1fe9a56-3d56-4a0e-9de2-e72af85ab29e\") " Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.407034 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk" (OuterVolumeSpecName: "kube-api-access-fnkkk") pod "f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" (UID: "f1fe9a56-3d56-4a0e-9de2-e72af85ab29e"). InnerVolumeSpecName "kube-api-access-fnkkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.502895 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnkkk\" (UniqueName: \"kubernetes.io/projected/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e-kube-api-access-fnkkk\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.653881 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" path="/var/lib/kubelet/pods/f1fe9a56-3d56-4a0e-9de2-e72af85ab29e/volumes" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.787964 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-k9s54"] Jan 31 09:56:53 crc kubenswrapper[4783]: E0131 09:56:53.788472 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" containerName="container-00" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.788497 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" containerName="container-00" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.788729 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1fe9a56-3d56-4a0e-9de2-e72af85ab29e" containerName="container-00" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.789392 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.827039 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.827302 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtqj\" (UniqueName: \"kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.928439 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.928525 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtqj\" (UniqueName: \"kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc kubenswrapper[4783]: I0131 09:56:53.928604 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:53 crc 
kubenswrapper[4783]: I0131 09:56:53.944641 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtqj\" (UniqueName: \"kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj\") pod \"crc-debug-k9s54\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:54 crc kubenswrapper[4783]: I0131 09:56:54.103983 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:54 crc kubenswrapper[4783]: W0131 09:56:54.140762 4783 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc9c0fab_440f_4974_86df_ec2a7a38fb93.slice/crio-7af2d052acf8d26c524e4546971967059e167cc86f6a29de04c15c30025b97aa WatchSource:0}: Error finding container 7af2d052acf8d26c524e4546971967059e167cc86f6a29de04c15c30025b97aa: Status 404 returned error can't find the container with id 7af2d052acf8d26c524e4546971967059e167cc86f6a29de04c15c30025b97aa Jan 31 09:56:54 crc kubenswrapper[4783]: I0131 09:56:54.233471 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" event={"ID":"cc9c0fab-440f-4974-86df-ec2a7a38fb93","Type":"ContainerStarted","Data":"7af2d052acf8d26c524e4546971967059e167cc86f6a29de04c15c30025b97aa"} Jan 31 09:56:54 crc kubenswrapper[4783]: I0131 09:56:54.237411 4783 scope.go:117] "RemoveContainer" containerID="dc504bbcd3b9f94b6ae6372404e13005e64f1870efb2a595034ac08687634319" Jan 31 09:56:54 crc kubenswrapper[4783]: I0131 09:56:54.237699 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-hhqbs" Jan 31 09:56:55 crc kubenswrapper[4783]: I0131 09:56:55.246420 4783 generic.go:334] "Generic (PLEG): container finished" podID="cc9c0fab-440f-4974-86df-ec2a7a38fb93" containerID="27f5014f1c6cd8e7e27ad74778dded831b5e25ad75a794eecaf99044c69f2455" exitCode=0 Jan 31 09:56:55 crc kubenswrapper[4783]: I0131 09:56:55.246523 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" event={"ID":"cc9c0fab-440f-4974-86df-ec2a7a38fb93","Type":"ContainerDied","Data":"27f5014f1c6cd8e7e27ad74778dded831b5e25ad75a794eecaf99044c69f2455"} Jan 31 09:56:55 crc kubenswrapper[4783]: I0131 09:56:55.282689 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-k9s54"] Jan 31 09:56:55 crc kubenswrapper[4783]: I0131 09:56:55.290961 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c8zqw/crc-debug-k9s54"] Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.330141 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.375825 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host\") pod \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.375929 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host" (OuterVolumeSpecName: "host") pod "cc9c0fab-440f-4974-86df-ec2a7a38fb93" (UID: "cc9c0fab-440f-4974-86df-ec2a7a38fb93"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.376444 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jtqj\" (UniqueName: \"kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj\") pod \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\" (UID: \"cc9c0fab-440f-4974-86df-ec2a7a38fb93\") " Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.376920 4783 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc9c0fab-440f-4974-86df-ec2a7a38fb93-host\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.383317 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj" (OuterVolumeSpecName: "kube-api-access-6jtqj") pod "cc9c0fab-440f-4974-86df-ec2a7a38fb93" (UID: "cc9c0fab-440f-4974-86df-ec2a7a38fb93"). InnerVolumeSpecName "kube-api-access-6jtqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:56 crc kubenswrapper[4783]: I0131 09:56:56.479416 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jtqj\" (UniqueName: \"kubernetes.io/projected/cc9c0fab-440f-4974-86df-ec2a7a38fb93-kube-api-access-6jtqj\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:57 crc kubenswrapper[4783]: I0131 09:56:57.267397 4783 scope.go:117] "RemoveContainer" containerID="27f5014f1c6cd8e7e27ad74778dded831b5e25ad75a794eecaf99044c69f2455" Jan 31 09:56:57 crc kubenswrapper[4783]: I0131 09:56:57.267426 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/crc-debug-k9s54" Jan 31 09:56:57 crc kubenswrapper[4783]: I0131 09:56:57.654052 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9c0fab-440f-4974-86df-ec2a7a38fb93" path="/var/lib/kubelet/pods/cc9c0fab-440f-4974-86df-ec2a7a38fb93/volumes" Jan 31 09:57:04 crc kubenswrapper[4783]: I0131 09:57:04.646088 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:57:04 crc kubenswrapper[4783]: E0131 09:57:04.646927 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:57:17 crc kubenswrapper[4783]: I0131 09:57:17.646323 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:57:17 crc kubenswrapper[4783]: E0131 09:57:17.647022 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.587549 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7994d94564-47gt2_6ec6cc70-cb74-40f6-acb3-3423b5045651/barbican-api/0.log" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.727103 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7994d94564-47gt2_6ec6cc70-cb74-40f6-acb3-3423b5045651/barbican-api-log/0.log" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.733270 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5cf88c75b6-glzzx_3cd5ec39-81e0-44cd-b99f-01e3d301b192/barbican-keystone-listener/0.log" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.752954 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5cf88c75b6-glzzx_3cd5ec39-81e0-44cd-b99f-01e3d301b192/barbican-keystone-listener-log/0.log" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.875405 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c946647fc-lsk5p_814c0372-f441-4fce-b7d3-47827597fdd5/barbican-worker/0.log" Jan 31 09:57:23 crc kubenswrapper[4783]: I0131 09:57:23.927619 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c946647fc-lsk5p_814c0372-f441-4fce-b7d3-47827597fdd5/barbican-worker-log/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.031319 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-77zkj_b55d8f8e-46cb-4119-a32b-723b06e29764/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.120643 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/ceilometer-central-agent/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.141552 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/ceilometer-notification-agent/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.223760 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/proxy-httpd/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.287328 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4c9fe6f4-d5e6-4f59-8803-3e889c863d6c/sg-core/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.344980 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_efc715c4-8350-4307-91b8-d33c62513e41/cinder-api/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.409750 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_efc715c4-8350-4307-91b8-d33c62513e41/cinder-api-log/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.510283 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1eb12305-93aa-4b0a-960a-939eb7b74bec/probe/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.562029 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1eb12305-93aa-4b0a-960a-939eb7b74bec/cinder-scheduler/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.660844 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sjnvj_4c333cb5-3633-4cfe-825d-abc93c751acd/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.719409 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pbbxj_49b1076f-b620-47cc-8cbf-c70ecdbeab06/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.830999 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/init/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.949831 
4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/init/0.log" Jan 31 09:57:24 crc kubenswrapper[4783]: I0131 09:57:24.992328 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l8bgc_f6315c3c-0101-4935-b081-37414dd7e27e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.028139 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-m8kdr_19000338-c242-44b7-a9e2-1a0c0c15f58b/dnsmasq-dns/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.184876 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b9825cb-8bd2-446b-80ab-d6bdd294d51d/glance-httpd/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.216563 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7b9825cb-8bd2-446b-80ab-d6bdd294d51d/glance-log/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.337576 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6c4aab5d-1107-4452-9f07-fc45f446eb01/glance-httpd/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.384975 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6c4aab5d-1107-4452-9f07-fc45f446eb01/glance-log/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.472091 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6644bf8978-q24zg_940e2e96-d6a1-4576-b83a-e30ff1f6ab85/horizon/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.729057 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6644bf8978-q24zg_940e2e96-d6a1-4576-b83a-e30ff1f6ab85/horizon-log/0.log" Jan 31 
09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.873553 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jf9tk_b59b9070-af55-4915-bae2-414ca2aab1b7/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:25 crc kubenswrapper[4783]: I0131 09:57:25.979249 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c9hls_77285e40-3f9b-491d-a911-0f7b5c8058fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.179090 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-688b79757c-l8xjk_5fc52c5d-ccc0-431d-a57b-34efa4d1f1ae/keystone-api/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.188392 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7c07732e-abfe-48cc-86c9-b500fff4977d/kube-state-metrics/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.318394 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j6xgm_777883d7-012b-4006-afcb-d5fcd8a0eb68/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.602714 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754bf5467-627tt_e522ac0d-9e88-42f8-82a7-54cb22a15841/neutron-api/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.657410 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ntcn2_a22a7456-83e3-46ef-80c2-ebea731972b9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:26 crc kubenswrapper[4783]: I0131 09:57:26.673270 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-754bf5467-627tt_e522ac0d-9e88-42f8-82a7-54cb22a15841/neutron-httpd/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.187264 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b4163808-2c2d-4fdd-a2b3-84b36dfa4112/nova-api-log/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.269270 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_234cb04e-3bcc-47d4-95ef-16eb1c8ad3c7/nova-cell0-conductor-conductor/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.549264 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5d42e544-b623-439f-b4a8-9ee7cb72386c/nova-cell1-conductor-conductor/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.616547 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4655d42d-6876-4b07-bde2-d8a70c62018d/nova-cell1-novncproxy-novncproxy/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.633521 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b4163808-2c2d-4fdd-a2b3-84b36dfa4112/nova-api-api/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.763786 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q6sd4_57ec9c0f-9c30-4c10-afd7-84ac778f9069/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:27 crc kubenswrapper[4783]: I0131 09:57:27.937361 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_011e6c87-b549-4268-b64e-b5d49c9e7cd8/nova-metadata-log/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.183235 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_43315408-26b4-4864-af4e-e3cbad195816/nova-scheduler-scheduler/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.224119 
4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/mysql-bootstrap/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.441014 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/galera/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.448884 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c9a5bd57-8542-4509-a620-c48d2f6c9e06/mysql-bootstrap/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.827025 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/mysql-bootstrap/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.847184 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_011e6c87-b549-4268-b64e-b5d49c9e7cd8/nova-metadata-metadata/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.978619 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/mysql-bootstrap/0.log" Jan 31 09:57:28 crc kubenswrapper[4783]: I0131 09:57:28.982453 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_03eade59-3312-49be-a51a-9fdcd37f9a33/galera/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.096472 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b1789966-6119-4be7-87b8-cca3381fc380/openstackclient/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.199696 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fmnfb_7e49f761-fcba-4ec3-9091-61f056e4eb58/openstack-network-exporter/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.307195 4783 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server-init/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.489906 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server-init/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.489956 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovs-vswitchd/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.506189 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7st6_8575964a-bedb-456c-b992-116f66bb7fa2/ovsdb-server/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.672243 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rll65_ea827790-18ef-4c55-8b5f-365ead9b9f6c/ovn-controller/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.833111 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zp2tb_cf5af9ea-73f9-4316-8fdc-abe4ede8632a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.863266 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_273da0f4-592f-4736-8435-28cd6f46ed55/openstack-network-exporter/0.log" Jan 31 09:57:29 crc kubenswrapper[4783]: I0131 09:57:29.972733 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_273da0f4-592f-4736-8435-28cd6f46ed55/ovn-northd/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.004126 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e90cb35-3366-44df-9238-3da82d300654/openstack-network-exporter/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.061003 4783 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8e90cb35-3366-44df-9238-3da82d300654/ovsdbserver-nb/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.195588 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bed60581-fb96-4e66-bd14-2e2c0f75a771/openstack-network-exporter/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.245506 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bed60581-fb96-4e66-bd14-2e2c0f75a771/ovsdbserver-sb/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.518310 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5df45c7c98-nt6z5_05962991-82e5-4c31-87fa-c7df3cba5f90/placement-api/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.559467 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5df45c7c98-nt6z5_05962991-82e5-4c31-87fa-c7df3cba5f90/placement-log/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.634988 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/setup-container/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.647541 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:57:30 crc kubenswrapper[4783]: E0131 09:57:30.648053 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.782777 4783 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/setup-container/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.826097 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/setup-container/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.850839 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_99ac760c-1287-4674-9133-ee9124e9fbbd/rabbitmq/0.log" Jan 31 09:57:30 crc kubenswrapper[4783]: I0131 09:57:30.999049 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/setup-container/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.068344 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4902a1ee-5b54-48bd-b8fb-8be63db315a5/rabbitmq/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.106988 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kp7kb_cf6bea6a-6877-4624-b8ac-cfd51fb514a9/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.287224 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-trw4c_968fb64b-ed6e-492b-a508-41dd3dd98085/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.294060 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xpb29_f3679d22-7479-40f7-9c8b-2e0caa156965/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.518792 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5mpq2_35e3815e-af8f-4724-846b-ea6038002f70/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.521248 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-bmbrc_e9ce688e-1f0e-486c-b3c7-4b45243713ed/ssh-known-hosts-edpm-deployment/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.704342 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7846c976fc-knpz2_ff4d96b6-b227-41e4-a653-39b8475aa9de/proxy-server/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.849381 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7846c976fc-knpz2_ff4d96b6-b227-41e4-a653-39b8475aa9de/proxy-httpd/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.868494 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-58vfp_6827ccb1-8fcf-4451-a878-25d3d5765ae6/swift-ring-rebalance/0.log" Jan 31 09:57:31 crc kubenswrapper[4783]: I0131 09:57:31.996651 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-auditor/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.107005 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-replicator/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.114195 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-reaper/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.134980 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/account-server/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 
09:57:32.205300 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-auditor/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.301350 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-server/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.303592 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-updater/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.361731 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/container-replicator/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.496040 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-auditor/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.519044 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-expirer/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.539302 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-replicator/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.561408 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-server/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.673524 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/object-updater/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.716750 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/rsync/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.754914 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c78f0039-d432-4056-a572-d3049488bb75/swift-recon-cron/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.967557 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_16292449-5a29-426e-aa57-18e752dd60f6/tempest-tests-tempest-tests-runner/0.log" Jan 31 09:57:32 crc kubenswrapper[4783]: I0131 09:57:32.980713 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xfj2g_36f447a7-72aa-465f-8fad-1c0bb7c71a9e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:33 crc kubenswrapper[4783]: I0131 09:57:33.126981 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3438915b-514e-4298-842a-c77aefe49803/test-operator-logs-container/0.log" Jan 31 09:57:33 crc kubenswrapper[4783]: I0131 09:57:33.193014 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-97pwd_1f0db336-229b-4d05-b3e9-aaa8b26b08c4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 09:57:43 crc kubenswrapper[4783]: I0131 09:57:43.514439 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c967214f-ce54-4ac2-ae54-2d750133ff97/memcached/0.log" Jan 31 09:57:44 crc kubenswrapper[4783]: I0131 09:57:44.645946 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:57:44 crc kubenswrapper[4783]: E0131 09:57:44.646504 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.107563 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.222705 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.247428 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.304918 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.400442 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/util/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.401384 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/extract/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.463760 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1n5znv_297a5b92-55db-4a84-8bd3-878ea32367df/pull/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.605587 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-gwkhk_3a079322-76ea-4cb9-a8b6-3f0b1a360086/manager/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.632520 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-vjgpf_99914340-4708-4322-996f-7392f6fe6e02/manager/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.744987 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-k9948_7ddb9dd0-fc57-4685-a7d5-778a4152ea58/manager/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.861702 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-btfdw_e4606683-7a0b-4a0f-ae81-3c6e598a36e6/manager/0.log" Jan 31 09:57:55 crc kubenswrapper[4783]: I0131 09:57:55.900919 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-ln9h6_1e1e07c3-0aeb-47fd-be71-a13716a04f29/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.018409 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-4nwdm_1de23cab-104f-49ac-ab9f-3b1d08733ff9/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.266094 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-82z66_bbcb9fbf-dab0-4029-b8a8-9e6f13bdf352/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.292055 4783 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-rn96f_5033c800-ef69-4228-a204-b66401c4725c/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.429512 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-t7nxg_cd28073a-c4f4-4b1d-9680-a9d5a5939deb/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.452157 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-97h8r_8fdee142-92ef-49d1-bac6-6f6c3873b2cb/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.604721 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-bxv5z_acb9dbfe-e754-4021-bc54-7ccd17b217a4/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.645369 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:57:56 crc kubenswrapper[4783]: E0131 09:57:56.645670 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.664223 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-fvml6_be9f1345-8ca5-49da-a52e-4b841ea07ac3/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.834617 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-s842k_fa110500-5ebd-4645-86a1-e3bf4b9780fe/manager/0.log" Jan 31 09:57:56 crc kubenswrapper[4783]: I0131 09:57:56.847329 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-tmf9x_8178283d-c10c-45e6-a465-bdb5096d8904/manager/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.009470 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7mwgsp_10649b85-8b3a-44e2-9477-6f5821d232a7/manager/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.076404 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-s5jcf_5cdb2757-415c-4ffe-bcb1-0c07dfeee1ab/operator/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.308257 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z2jkn_474ac824-d8f5-4d2b-9b6f-c385808d57b8/registry-server/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.454329 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-x86p2_caf4a0bd-2f55-4185-a756-4a640cbfe8d3/manager/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.526190 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-x6rmh_5bc18646-6e2c-41b5-8690-b6b7eda1a8cc/manager/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.740405 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pfgj4_2d0fc101-8afd-4154-9741-d5d3520990fe/operator/0.log" Jan 31 09:57:57 crc kubenswrapper[4783]: I0131 09:57:57.922793 4783 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-5hw2c_c42112c2-917b-491b-9a4a-5253a0fc8d09/manager/0.log" Jan 31 09:57:58 crc kubenswrapper[4783]: I0131 09:57:58.031935 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-8dwmb_8ad3860f-1b60-4522-8026-08212156646d/manager/0.log" Jan 31 09:57:58 crc kubenswrapper[4783]: I0131 09:57:58.105335 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-9d95x_d6255583-1dd4-4901-b3af-8619aa03434b/manager/0.log" Jan 31 09:57:58 crc kubenswrapper[4783]: I0131 09:57:58.195722 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-jqzh7_6360a87f-2ddf-4c17-9f25-cff4e0f5e747/manager/0.log" Jan 31 09:57:58 crc kubenswrapper[4783]: I0131 09:57:58.321931 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-sbgw8_b4b9ea20-ea14-4c87-b40e-5767debc9f57/manager/0.log" Jan 31 09:58:08 crc kubenswrapper[4783]: I0131 09:58:08.646823 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:58:08 crc kubenswrapper[4783]: E0131 09:58:08.647756 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:58:13 crc kubenswrapper[4783]: I0131 09:58:13.747942 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4qqm_a3f564d8-5f07-446d-9dd1-955e39d4a5f4/control-plane-machine-set-operator/0.log" Jan 31 09:58:13 crc kubenswrapper[4783]: I0131 09:58:13.908675 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m54sp_0e337b91-adb8-4cb7-8e5e-be2b80e78f56/kube-rbac-proxy/0.log" Jan 31 09:58:13 crc kubenswrapper[4783]: I0131 09:58:13.947498 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m54sp_0e337b91-adb8-4cb7-8e5e-be2b80e78f56/machine-api-operator/0.log" Jan 31 09:58:19 crc kubenswrapper[4783]: I0131 09:58:19.651704 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:58:19 crc kubenswrapper[4783]: E0131 09:58:19.652560 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:58:24 crc kubenswrapper[4783]: I0131 09:58:24.714503 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-bvf4q_5ac717c7-d413-4765-9bf5-b0d7ad8163c6/cert-manager-controller/0.log" Jan 31 09:58:24 crc kubenswrapper[4783]: I0131 09:58:24.890283 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mhfqz_c0721d7b-1d48-4f0f-bdba-4e2afa8cf7dd/cert-manager-cainjector/0.log" Jan 31 09:58:24 crc kubenswrapper[4783]: I0131 09:58:24.924595 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-bhlz2_db7bedab-fb68-4a6a-887c-d2aa1a63d0ee/cert-manager-webhook/0.log" Jan 31 09:58:31 crc kubenswrapper[4783]: I0131 09:58:31.645793 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:58:31 crc kubenswrapper[4783]: E0131 09:58:31.646479 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:58:35 crc kubenswrapper[4783]: I0131 09:58:35.463543 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-58rxc_dd570070-6aad-4b28-aefd-e4e2ce7e6a8c/nmstate-console-plugin/0.log" Jan 31 09:58:35 crc kubenswrapper[4783]: I0131 09:58:35.573546 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vc2gq_4b90a875-3ddc-4ba4-a62a-dd83c9de4d59/nmstate-handler/0.log" Jan 31 09:58:35 crc kubenswrapper[4783]: I0131 09:58:35.625931 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kw4vm_20903e03-98bb-4970-b9b9-9088bfbd1902/kube-rbac-proxy/0.log" Jan 31 09:58:35 crc kubenswrapper[4783]: I0131 09:58:35.673041 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kw4vm_20903e03-98bb-4970-b9b9-9088bfbd1902/nmstate-metrics/0.log" Jan 31 09:58:35 crc kubenswrapper[4783]: I0131 09:58:35.794247 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-7nq22_66e4885b-8227-4433-8208-12dad761b627/nmstate-operator/0.log" Jan 31 09:58:36 crc kubenswrapper[4783]: I0131 09:58:36.002814 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-2cm9k_fe6808ce-fbb6-4782-831b-892b074b7267/nmstate-webhook/0.log" Jan 31 09:58:45 crc kubenswrapper[4783]: I0131 09:58:45.646553 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:58:45 crc kubenswrapper[4783]: E0131 09:58:45.647873 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:58:57 crc kubenswrapper[4783]: I0131 09:58:57.646362 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:58:57 crc kubenswrapper[4783]: E0131 09:58:57.647584 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:58:58 crc kubenswrapper[4783]: I0131 09:58:58.750420 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-qcxmw_90502cfa-f884-4181-b14c-98b49f254530/kube-rbac-proxy/0.log" Jan 31 09:58:58 crc kubenswrapper[4783]: I0131 
09:58:58.866786 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-qcxmw_90502cfa-f884-4181-b14c-98b49f254530/controller/0.log" Jan 31 09:58:58 crc kubenswrapper[4783]: I0131 09:58:58.984913 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-2q4pc_1e56e9fc-1576-4315-97b2-fa45c03bb8ca/frr-k8s-webhook-server/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.019449 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.233790 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.238133 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.241298 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.249544 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.420983 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.427805 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.429864 4783 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.451892 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.565902 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-frr-files/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.587178 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-reloader/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.607016 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/cp-metrics/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.653137 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/controller/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.780533 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/frr-metrics/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.820775 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/kube-rbac-proxy/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.844587 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/kube-rbac-proxy-frr/0.log" Jan 31 09:58:59 crc kubenswrapper[4783]: I0131 09:58:59.954812 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/reloader/0.log" Jan 31 09:59:00 crc kubenswrapper[4783]: I0131 09:59:00.040719 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84cd58cb5d-hkgvw_db52c704-6d54-4f49-9168-903b12ed4a25/manager/0.log" Jan 31 09:59:00 crc kubenswrapper[4783]: I0131 09:59:00.229053 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbf6c4975-7nlfs_9392ad71-70f9-4727-baaa-68ddfa6b3361/webhook-server/0.log" Jan 31 09:59:00 crc kubenswrapper[4783]: I0131 09:59:00.419271 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn6z7_9d4ef83f-da80-4f86-8e3f-6618d9bd5c44/kube-rbac-proxy/0.log" Jan 31 09:59:00 crc kubenswrapper[4783]: I0131 09:59:00.810437 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hn6z7_9d4ef83f-da80-4f86-8e3f-6618d9bd5c44/speaker/0.log" Jan 31 09:59:00 crc kubenswrapper[4783]: I0131 09:59:00.985287 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x77mn_f71dbce1-2082-4ee0-8b6b-21fdf4313b06/frr/0.log" Jan 31 09:59:07 crc kubenswrapper[4783]: I0131 09:59:07.922532 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:07 crc kubenswrapper[4783]: E0131 09:59:07.923522 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9c0fab-440f-4974-86df-ec2a7a38fb93" containerName="container-00" Jan 31 09:59:07 crc kubenswrapper[4783]: I0131 09:59:07.923540 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9c0fab-440f-4974-86df-ec2a7a38fb93" containerName="container-00" Jan 31 09:59:07 crc kubenswrapper[4783]: I0131 09:59:07.923774 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9c0fab-440f-4974-86df-ec2a7a38fb93" containerName="container-00" Jan 31 
09:59:07 crc kubenswrapper[4783]: I0131 09:59:07.925968 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:07 crc kubenswrapper[4783]: I0131 09:59:07.933841 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.002073 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrwm4\" (UniqueName: \"kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.002449 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.002847 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.104339 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " 
pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.104479 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrwm4\" (UniqueName: \"kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.104513 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.104826 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.105075 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.121693 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrwm4\" (UniqueName: \"kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4\") pod \"community-operators-m9p8q\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " 
pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.244729 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:08 crc kubenswrapper[4783]: I0131 09:59:08.694012 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:09 crc kubenswrapper[4783]: I0131 09:59:09.347825 4783 generic.go:334] "Generic (PLEG): container finished" podID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerID="4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2" exitCode=0 Jan 31 09:59:09 crc kubenswrapper[4783]: I0131 09:59:09.348023 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerDied","Data":"4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2"} Jan 31 09:59:09 crc kubenswrapper[4783]: I0131 09:59:09.348194 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerStarted","Data":"d646e355f09d4a9a7e704acaac29c3857e97953c782c21c4a33b796499ac750a"} Jan 31 09:59:10 crc kubenswrapper[4783]: I0131 09:59:10.360208 4783 generic.go:334] "Generic (PLEG): container finished" podID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerID="36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3" exitCode=0 Jan 31 09:59:10 crc kubenswrapper[4783]: I0131 09:59:10.360314 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerDied","Data":"36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3"} Jan 31 09:59:11 crc kubenswrapper[4783]: I0131 09:59:11.370569 4783 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerStarted","Data":"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963"} Jan 31 09:59:11 crc kubenswrapper[4783]: I0131 09:59:11.387312 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9p8q" podStartSLOduration=2.898452215 podStartE2EDuration="4.387295657s" podCreationTimestamp="2026-01-31 09:59:07 +0000 UTC" firstStartedPulling="2026-01-31 09:59:09.350942839 +0000 UTC m=+3260.019626307" lastFinishedPulling="2026-01-31 09:59:10.839786281 +0000 UTC m=+3261.508469749" observedRunningTime="2026-01-31 09:59:11.38213668 +0000 UTC m=+3262.050820148" watchObservedRunningTime="2026-01-31 09:59:11.387295657 +0000 UTC m=+3262.055979125" Jan 31 09:59:11 crc kubenswrapper[4783]: I0131 09:59:11.645932 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:59:11 crc kubenswrapper[4783]: E0131 09:59:11.646386 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.148570 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.256675 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.308004 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.308177 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.500261 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/util/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.520255 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.521847 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcr9hch_a8892df4-c6f7-42b1-b003-c7d359c74690/extract/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.679856 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.802915 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 
09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.807969 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.813137 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.945721 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/pull/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.947492 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/util/0.log" Jan 31 09:59:12 crc kubenswrapper[4783]: I0131 09:59:12.948702 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135rmqv_a19e1f79-11c0-430c-9bea-96c2878fde55/extract/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.110915 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.255134 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.263971 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.415635 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.557144 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-utilities/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.558582 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/extract-content/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.734630 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.976399 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5dqb2_fe131d5f-6dbb-43d5-8ff0-63b8d9e901a2/registry-server/0.log" Jan 31 09:59:13 crc kubenswrapper[4783]: I0131 09:59:13.978006 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.000783 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.020864 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.174648 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.185768 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/extract-utilities/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.366290 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-utilities/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.545513 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.558818 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-utilities/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.567438 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.651357 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-729jp_2b509e05-1b13-486b-8986-6a343c3110b8/registry-server/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.760420 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-content/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.795066 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/registry-server/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.808356 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m9p8q_0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/extract-utilities/0.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.914419 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-729tq_6e7fa19e-aa64-4479-805e-62625ccc19b8/marketplace-operator/1.log" Jan 31 09:59:14 crc kubenswrapper[4783]: I0131 09:59:14.987176 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-729tq_6e7fa19e-aa64-4479-805e-62625ccc19b8/marketplace-operator/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.000062 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.163025 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.171972 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.189433 4783 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.303415 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-utilities/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.348895 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/extract-content/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.394512 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.401121 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xm4z9_c7cddcda-dcff-4c7a-b437-2bd3750b9200/registry-server/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.512525 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.514144 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.540767 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.701916 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-content/0.log" Jan 
31 09:59:15 crc kubenswrapper[4783]: I0131 09:59:15.703502 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/extract-utilities/0.log" Jan 31 09:59:16 crc kubenswrapper[4783]: I0131 09:59:16.029544 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4fjj8_86ccd026-7c4f-4a84-8baa-45cfafa1abba/registry-server/0.log" Jan 31 09:59:18 crc kubenswrapper[4783]: I0131 09:59:18.245768 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:18 crc kubenswrapper[4783]: I0131 09:59:18.246900 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:18 crc kubenswrapper[4783]: I0131 09:59:18.281010 4783 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:18 crc kubenswrapper[4783]: I0131 09:59:18.463019 4783 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:18 crc kubenswrapper[4783]: I0131 09:59:18.512630 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.434681 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9p8q" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="registry-server" containerID="cri-o://32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963" gracePeriod=2 Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.780353 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.965827 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrwm4\" (UniqueName: \"kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4\") pod \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.966010 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities\") pod \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.966134 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content\") pod \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\" (UID: \"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf\") " Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.966850 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities" (OuterVolumeSpecName: "utilities") pod "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" (UID: "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:59:20 crc kubenswrapper[4783]: I0131 09:59:20.971851 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4" (OuterVolumeSpecName: "kube-api-access-lrwm4") pod "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" (UID: "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf"). InnerVolumeSpecName "kube-api-access-lrwm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.003053 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" (UID: "0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.069228 4783 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.069295 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrwm4\" (UniqueName: \"kubernetes.io/projected/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-kube-api-access-lrwm4\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.069310 4783 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.444630 4783 generic.go:334] "Generic (PLEG): container finished" podID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerID="32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963" exitCode=0 Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.444728 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m9p8q" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.444769 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerDied","Data":"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963"} Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.446924 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9p8q" event={"ID":"0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf","Type":"ContainerDied","Data":"d646e355f09d4a9a7e704acaac29c3857e97953c782c21c4a33b796499ac750a"} Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.447003 4783 scope.go:117] "RemoveContainer" containerID="32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.470274 4783 scope.go:117] "RemoveContainer" containerID="36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.478895 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.487622 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9p8q"] Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.499269 4783 scope.go:117] "RemoveContainer" containerID="4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.522308 4783 scope.go:117] "RemoveContainer" containerID="32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963" Jan 31 09:59:21 crc kubenswrapper[4783]: E0131 09:59:21.522679 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963\": container with ID starting with 32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963 not found: ID does not exist" containerID="32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.522724 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963"} err="failed to get container status \"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963\": rpc error: code = NotFound desc = could not find container \"32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963\": container with ID starting with 32c2d3ef6d5778adf5a101d4c756bae3ad645a930123a954bfdf72050e941963 not found: ID does not exist" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.522754 4783 scope.go:117] "RemoveContainer" containerID="36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3" Jan 31 09:59:21 crc kubenswrapper[4783]: E0131 09:59:21.523245 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3\": container with ID starting with 36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3 not found: ID does not exist" containerID="36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.523362 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3"} err="failed to get container status \"36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3\": rpc error: code = NotFound desc = could not find container \"36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3\": container with ID 
starting with 36fd00a89f1ca1d0bf6ecbc67ba71391172e15e3bfe9bd11ff65d072383f65b3 not found: ID does not exist" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.523439 4783 scope.go:117] "RemoveContainer" containerID="4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2" Jan 31 09:59:21 crc kubenswrapper[4783]: E0131 09:59:21.523783 4783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2\": container with ID starting with 4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2 not found: ID does not exist" containerID="4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.523814 4783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2"} err="failed to get container status \"4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2\": rpc error: code = NotFound desc = could not find container \"4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2\": container with ID starting with 4a1889ed6f5493f5ad65a1ef95263996b0777b5ce7ca8898626f31b19adfa8e2 not found: ID does not exist" Jan 31 09:59:21 crc kubenswrapper[4783]: I0131 09:59:21.654804 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" path="/var/lib/kubelet/pods/0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf/volumes" Jan 31 09:59:24 crc kubenswrapper[4783]: I0131 09:59:24.645942 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:59:24 crc kubenswrapper[4783]: E0131 09:59:24.646589 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:59:37 crc kubenswrapper[4783]: I0131 09:59:37.646928 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:59:37 crc kubenswrapper[4783]: E0131 09:59:37.647652 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 09:59:50 crc kubenswrapper[4783]: I0131 09:59:50.646537 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 09:59:50 crc kubenswrapper[4783]: E0131 09:59:50.647450 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.136050 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx"] Jan 31 10:00:00 crc kubenswrapper[4783]: E0131 10:00:00.137120 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" 
containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.137135 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4783]: E0131 10:00:00.137148 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="extract-utilities" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.137153 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="extract-utilities" Jan 31 10:00:00 crc kubenswrapper[4783]: E0131 10:00:00.137194 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="extract-content" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.137202 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="extract-content" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.137449 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="0423ac62-e6a8-4beb-b0c5-c7ca6aad5acf" containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.138091 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.139619 4783 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.139734 4783 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.147040 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx"] Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.303150 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxq7\" (UniqueName: \"kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.303432 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.303531 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.404666 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxq7\" (UniqueName: \"kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.404710 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.404748 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.405583 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.419279 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.419991 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxq7\" (UniqueName: \"kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7\") pod \"collect-profiles-29497560-bldcx\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.457194 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:00 crc kubenswrapper[4783]: I0131 10:00:00.867900 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx"] Jan 31 10:00:01 crc kubenswrapper[4783]: I0131 10:00:01.775735 4783 generic.go:334] "Generic (PLEG): container finished" podID="612f73dc-5f89-4a93-a4d5-953051e4d517" containerID="3f6b7dec9ad2418e9979192a3fbf65defb939ced42e594d8affab32149c8a25d" exitCode=0 Jan 31 10:00:01 crc kubenswrapper[4783]: I0131 10:00:01.775795 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" event={"ID":"612f73dc-5f89-4a93-a4d5-953051e4d517","Type":"ContainerDied","Data":"3f6b7dec9ad2418e9979192a3fbf65defb939ced42e594d8affab32149c8a25d"} Jan 31 10:00:01 crc kubenswrapper[4783]: I0131 10:00:01.776111 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" 
event={"ID":"612f73dc-5f89-4a93-a4d5-953051e4d517","Type":"ContainerStarted","Data":"d4c37dfc4fd1e530535c6624315d4453f79f828d2e166b90f2be09ab884cf4e5"} Jan 31 10:00:02 crc kubenswrapper[4783]: I0131 10:00:02.646411 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 10:00:02 crc kubenswrapper[4783]: E0131 10:00:02.646874 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.085771 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.167829 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume\") pod \"612f73dc-5f89-4a93-a4d5-953051e4d517\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.168761 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume" (OuterVolumeSpecName: "config-volume") pod "612f73dc-5f89-4a93-a4d5-953051e4d517" (UID: "612f73dc-5f89-4a93-a4d5-953051e4d517"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.270624 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume\") pod \"612f73dc-5f89-4a93-a4d5-953051e4d517\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.270717 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxq7\" (UniqueName: \"kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7\") pod \"612f73dc-5f89-4a93-a4d5-953051e4d517\" (UID: \"612f73dc-5f89-4a93-a4d5-953051e4d517\") " Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.275978 4783 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/612f73dc-5f89-4a93-a4d5-953051e4d517-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.284394 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "612f73dc-5f89-4a93-a4d5-953051e4d517" (UID: "612f73dc-5f89-4a93-a4d5-953051e4d517"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.284573 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7" (OuterVolumeSpecName: "kube-api-access-qcxq7") pod "612f73dc-5f89-4a93-a4d5-953051e4d517" (UID: "612f73dc-5f89-4a93-a4d5-953051e4d517"). InnerVolumeSpecName "kube-api-access-qcxq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.379135 4783 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/612f73dc-5f89-4a93-a4d5-953051e4d517-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.379200 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxq7\" (UniqueName: \"kubernetes.io/projected/612f73dc-5f89-4a93-a4d5-953051e4d517-kube-api-access-qcxq7\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.795399 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" event={"ID":"612f73dc-5f89-4a93-a4d5-953051e4d517","Type":"ContainerDied","Data":"d4c37dfc4fd1e530535c6624315d4453f79f828d2e166b90f2be09ab884cf4e5"} Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.795673 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c37dfc4fd1e530535c6624315d4453f79f828d2e166b90f2be09ab884cf4e5" Jan 31 10:00:03 crc kubenswrapper[4783]: I0131 10:00:03.795476 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-bldcx" Jan 31 10:00:04 crc kubenswrapper[4783]: I0131 10:00:04.166371 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd"] Jan 31 10:00:04 crc kubenswrapper[4783]: I0131 10:00:04.178606 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-dbtqd"] Jan 31 10:00:05 crc kubenswrapper[4783]: I0131 10:00:05.660632 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0f6d8f-df21-48f3-817a-42e0363934cb" path="/var/lib/kubelet/pods/cc0f6d8f-df21-48f3-817a-42e0363934cb/volumes" Jan 31 10:00:13 crc kubenswrapper[4783]: I0131 10:00:13.646267 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 10:00:13 crc kubenswrapper[4783]: E0131 10:00:13.647583 4783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqnx9_openshift-machine-config-operator(fb43cc7e-a0e2-4518-b732-3410c4d4cb5b)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" Jan 31 10:00:25 crc kubenswrapper[4783]: I0131 10:00:25.646286 4783 scope.go:117] "RemoveContainer" containerID="a27b46cbd4e486661c2521f74fcae6c4076e120bf4da8d9e1ca6934ba3f53ac6" Jan 31 10:00:25 crc kubenswrapper[4783]: I0131 10:00:25.987781 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" event={"ID":"fb43cc7e-a0e2-4518-b732-3410c4d4cb5b","Type":"ContainerStarted","Data":"4fcc6c8c0f54df85f2f6195345e5ea7ed58201b8cb8db0b43a9462a85f589cb1"} Jan 31 10:00:39 crc kubenswrapper[4783]: I0131 10:00:39.092666 4783 
generic.go:334] "Generic (PLEG): container finished" podID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerID="0023ccdb4626cadbe827315480b86721ff80debe53267fc3af009699d8ac3015" exitCode=0 Jan 31 10:00:39 crc kubenswrapper[4783]: I0131 10:00:39.092732 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" event={"ID":"2eab12dc-dad2-43be-9823-6a868e74f9a0","Type":"ContainerDied","Data":"0023ccdb4626cadbe827315480b86721ff80debe53267fc3af009699d8ac3015"} Jan 31 10:00:39 crc kubenswrapper[4783]: I0131 10:00:39.093950 4783 scope.go:117] "RemoveContainer" containerID="0023ccdb4626cadbe827315480b86721ff80debe53267fc3af009699d8ac3015" Jan 31 10:00:39 crc kubenswrapper[4783]: I0131 10:00:39.649396 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c8zqw_must-gather-fxhdm_2eab12dc-dad2-43be-9823-6a868e74f9a0/gather/0.log" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.016937 4783 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c8zqw/must-gather-fxhdm"] Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.017827 4783 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="copy" containerID="cri-o://27aa26442f30696e4407e07f8176023dde9011920e2735d65ba90d5d8ce0ae78" gracePeriod=2 Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.036554 4783 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c8zqw/must-gather-fxhdm"] Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.183206 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c8zqw_must-gather-fxhdm_2eab12dc-dad2-43be-9823-6a868e74f9a0/copy/0.log" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.183576 4783 generic.go:334] "Generic (PLEG): container finished" podID="2eab12dc-dad2-43be-9823-6a868e74f9a0" 
containerID="27aa26442f30696e4407e07f8176023dde9011920e2735d65ba90d5d8ce0ae78" exitCode=143 Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.450470 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c8zqw_must-gather-fxhdm_2eab12dc-dad2-43be-9823-6a868e74f9a0/copy/0.log" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.451251 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.614320 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output\") pod \"2eab12dc-dad2-43be-9823-6a868e74f9a0\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.614420 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwgpz\" (UniqueName: \"kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz\") pod \"2eab12dc-dad2-43be-9823-6a868e74f9a0\" (UID: \"2eab12dc-dad2-43be-9823-6a868e74f9a0\") " Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.620707 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz" (OuterVolumeSpecName: "kube-api-access-zwgpz") pod "2eab12dc-dad2-43be-9823-6a868e74f9a0" (UID: "2eab12dc-dad2-43be-9823-6a868e74f9a0"). InnerVolumeSpecName "kube-api-access-zwgpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.719090 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwgpz\" (UniqueName: \"kubernetes.io/projected/2eab12dc-dad2-43be-9823-6a868e74f9a0-kube-api-access-zwgpz\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.761495 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2eab12dc-dad2-43be-9823-6a868e74f9a0" (UID: "2eab12dc-dad2-43be-9823-6a868e74f9a0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:00:49 crc kubenswrapper[4783]: I0131 10:00:49.820418 4783 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eab12dc-dad2-43be-9823-6a868e74f9a0-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:50 crc kubenswrapper[4783]: I0131 10:00:50.191621 4783 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c8zqw_must-gather-fxhdm_2eab12dc-dad2-43be-9823-6a868e74f9a0/copy/0.log" Jan 31 10:00:50 crc kubenswrapper[4783]: I0131 10:00:50.192082 4783 scope.go:117] "RemoveContainer" containerID="27aa26442f30696e4407e07f8176023dde9011920e2735d65ba90d5d8ce0ae78" Jan 31 10:00:50 crc kubenswrapper[4783]: I0131 10:00:50.192141 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c8zqw/must-gather-fxhdm" Jan 31 10:00:50 crc kubenswrapper[4783]: I0131 10:00:50.219217 4783 scope.go:117] "RemoveContainer" containerID="0023ccdb4626cadbe827315480b86721ff80debe53267fc3af009699d8ac3015" Jan 31 10:00:51 crc kubenswrapper[4783]: I0131 10:00:51.660335 4783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" path="/var/lib/kubelet/pods/2eab12dc-dad2-43be-9823-6a868e74f9a0/volumes" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.142644 4783 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29497561-hrgqf"] Jan 31 10:01:00 crc kubenswrapper[4783]: E0131 10:01:00.143614 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="gather" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143632 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="gather" Jan 31 10:01:00 crc kubenswrapper[4783]: E0131 10:01:00.143645 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="copy" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143652 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="copy" Jan 31 10:01:00 crc kubenswrapper[4783]: E0131 10:01:00.143671 4783 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612f73dc-5f89-4a93-a4d5-953051e4d517" containerName="collect-profiles" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143677 4783 state_mem.go:107] "Deleted CPUSet assignment" podUID="612f73dc-5f89-4a93-a4d5-953051e4d517" containerName="collect-profiles" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143882 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" 
containerName="copy" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143903 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eab12dc-dad2-43be-9823-6a868e74f9a0" containerName="gather" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.143918 4783 memory_manager.go:354] "RemoveStaleState removing state" podUID="612f73dc-5f89-4a93-a4d5-953051e4d517" containerName="collect-profiles" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.144620 4783 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.150510 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29497561-hrgqf"] Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.265131 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rrw\" (UniqueName: \"kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.265236 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.265339 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " 
pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.265378 4783 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.367840 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rrw\" (UniqueName: \"kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.368241 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.368339 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.368386 4783 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " 
pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.374775 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.374976 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.382554 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rrw\" (UniqueName: \"kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.391181 4783 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data\") pod \"keystone-cron-29497561-hrgqf\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:00 crc kubenswrapper[4783]: I0131 10:01:00.466112 4783 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:01 crc kubenswrapper[4783]: I0131 10:01:00.884153 4783 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29497561-hrgqf"] Jan 31 10:01:01 crc kubenswrapper[4783]: I0131 10:01:01.283274 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-hrgqf" event={"ID":"eb6cd35a-5fc0-4416-8945-5b778691cae4","Type":"ContainerStarted","Data":"49bb498f51f0ffb9512b415535f6bc547d379ee7e888df28497eccce6d4cc672"} Jan 31 10:01:01 crc kubenswrapper[4783]: I0131 10:01:01.283523 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-hrgqf" event={"ID":"eb6cd35a-5fc0-4416-8945-5b778691cae4","Type":"ContainerStarted","Data":"c814dc2d152fb2af62b62af98083f12a3ecbb9f7c2cb5327847d5642611a51ca"} Jan 31 10:01:01 crc kubenswrapper[4783]: I0131 10:01:01.304665 4783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29497561-hrgqf" podStartSLOduration=1.30465454 podStartE2EDuration="1.30465454s" podCreationTimestamp="2026-01-31 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:01:01.297083647 +0000 UTC m=+3371.965767125" watchObservedRunningTime="2026-01-31 10:01:01.30465454 +0000 UTC m=+3371.973338009" Jan 31 10:01:01 crc kubenswrapper[4783]: I0131 10:01:01.551239 4783 scope.go:117] "RemoveContainer" containerID="54749afc2bd331aaecae5feba5355cdd60b03efd7a49531d827f62dcee5c84ce" Jan 31 10:01:03 crc kubenswrapper[4783]: I0131 10:01:03.299070 4783 generic.go:334] "Generic (PLEG): container finished" podID="eb6cd35a-5fc0-4416-8945-5b778691cae4" containerID="49bb498f51f0ffb9512b415535f6bc547d379ee7e888df28497eccce6d4cc672" exitCode=0 Jan 31 10:01:03 crc kubenswrapper[4783]: I0131 10:01:03.299310 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29497561-hrgqf" event={"ID":"eb6cd35a-5fc0-4416-8945-5b778691cae4","Type":"ContainerDied","Data":"49bb498f51f0ffb9512b415535f6bc547d379ee7e888df28497eccce6d4cc672"} Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.605438 4783 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.635881 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data\") pod \"eb6cd35a-5fc0-4416-8945-5b778691cae4\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.636090 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle\") pod \"eb6cd35a-5fc0-4416-8945-5b778691cae4\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.636145 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rrw\" (UniqueName: \"kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw\") pod \"eb6cd35a-5fc0-4416-8945-5b778691cae4\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.636428 4783 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys\") pod \"eb6cd35a-5fc0-4416-8945-5b778691cae4\" (UID: \"eb6cd35a-5fc0-4416-8945-5b778691cae4\") " Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.642711 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb6cd35a-5fc0-4416-8945-5b778691cae4" (UID: "eb6cd35a-5fc0-4416-8945-5b778691cae4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.646506 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw" (OuterVolumeSpecName: "kube-api-access-94rrw") pod "eb6cd35a-5fc0-4416-8945-5b778691cae4" (UID: "eb6cd35a-5fc0-4416-8945-5b778691cae4"). InnerVolumeSpecName "kube-api-access-94rrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.663855 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb6cd35a-5fc0-4416-8945-5b778691cae4" (UID: "eb6cd35a-5fc0-4416-8945-5b778691cae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.677321 4783 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data" (OuterVolumeSpecName: "config-data") pod "eb6cd35a-5fc0-4416-8945-5b778691cae4" (UID: "eb6cd35a-5fc0-4416-8945-5b778691cae4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.740254 4783 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.740497 4783 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rrw\" (UniqueName: \"kubernetes.io/projected/eb6cd35a-5fc0-4416-8945-5b778691cae4-kube-api-access-94rrw\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.740569 4783 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:04 crc kubenswrapper[4783]: I0131 10:01:04.740630 4783 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6cd35a-5fc0-4416-8945-5b778691cae4-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:05 crc kubenswrapper[4783]: I0131 10:01:05.320996 4783 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-hrgqf" event={"ID":"eb6cd35a-5fc0-4416-8945-5b778691cae4","Type":"ContainerDied","Data":"c814dc2d152fb2af62b62af98083f12a3ecbb9f7c2cb5327847d5642611a51ca"} Jan 31 10:01:05 crc kubenswrapper[4783]: I0131 10:01:05.321053 4783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c814dc2d152fb2af62b62af98083f12a3ecbb9f7c2cb5327847d5642611a51ca" Jan 31 10:01:05 crc kubenswrapper[4783]: I0131 10:01:05.321130 4783 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-hrgqf" Jan 31 10:02:47 crc kubenswrapper[4783]: I0131 10:02:47.756611 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:02:47 crc kubenswrapper[4783]: I0131 10:02:47.757378 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:03:01 crc kubenswrapper[4783]: I0131 10:03:01.621527 4783 scope.go:117] "RemoveContainer" containerID="6cc836bdfdf9ad1da72bacd2bf1f3a97bb87c9d01d218ff1c9ef04734b7634e1" Jan 31 10:03:17 crc kubenswrapper[4783]: I0131 10:03:17.756409 4783 patch_prober.go:28] interesting pod/machine-config-daemon-bqnx9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:03:17 crc kubenswrapper[4783]: I0131 10:03:17.756982 4783 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqnx9" podUID="fb43cc7e-a0e2-4518-b732-3410c4d4cb5b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"